

4.1 Introduction

Much of what has been discussed in Chapter 3 could be said to have little significance to most of the world. Primary and secondary data collection is, in many senses, an expensive luxury - something which is only realistically applied to those small portions of the earth's surface which are comparatively densely populated and which have a high G.N.P. per capita. The vast majority of the world is not like this. There are no structures or means to plan and carry out questionnaire surveys, to equip and organize time series evaluations for several parameters of water quality, temperature or quantity, or to differentiate between the relative accessibility of differing production inputs. Life is not organized at this scale. There has never been the need to gather and collate the type of information which would be of use to intending fish producers, or indeed to almost any entrepreneur.

However, this is changing. In Chapter 1 we outlined the crucial need to select and reserve land or water areas which would be suitable for aquaculture or inland fisheries, i.e. in order to increase food supplies, employment opportunities and the wealth of an area. Whilst it is beyond the means of most developing countries to intensively survey their territories for a large number of varied parameters using ground based techniques, this is quite within the means of remote sensing (RS) technologies. In this chapter we aim first to look at RS by examining its definition, development and methods, and then to examine how RS can be of value to aquaculture and inland fisheries. We shall also need to acknowledge that, although RS has brought many benefits to the spatial analyst, there are still many limitations with regard to its use. Because there are several recent FAO publications which provide background information on RS (Lantieri, 1988; FAO-RSC Series 47, 1988; FAO-RSC Series 49, 1989), we shall concentrate on its applications to the search for optimum locations. We shall consider the integration of RS into GIS in Chapter 6, and a number of our case studies (in Chapter 7) specifically consider applications of RS to aquaculture, inland fisheries or related fields.

4.2 The Development of Remote Sensing

Remote sensing is “…concerned with the collection of data by a sensing device not in contact with the object being sensed, and the evaluation of the collected data, which is then termed information and is presented in map form or as statistics.” (Howard, 1985). Clearly, the concept of RS covers a huge field - a field within science and technology which encompasses a vast “applications domain”, i.e. in the sense of inputs of applied science, applications in the processing field and in the sense of the ways in which RS outputs can be applied.

Since Butler et al (1988) have already outlined the historical growth of RS (for the FAO), we will confine our resumé of its development to a few key advances. The human eye is a remote sensor and, although we can capture images which may be stored in the brain and later retrieved, we can only reproduce them in a subjective sense. The eye can only capture visible radiation which occupies a very small part of the complete range of radiation (which is known as the electromagnetic spectrum). To overcome these deficiencies various instruments or systems have been invented and developed. We will be concerned here with those instruments or systems which capture data from an aerial perspective, i.e. allowing maps to be easily created.

As a means of capturing images, the camera was first developed in France in the 1830s, but it was not until 1858 that the first aerial photograph was taken from a captive balloon near Paris. During the rest of the 19th century advances were made in cameras, and additional camera platforms were experimented with. The first photograph from an aeroplane was taken in 1909 over Centocelle in Italy. During the First World War aerial photography was utilized on a large and systematic scale, both in Europe and the Near East, with specially designed cameras and film processing techniques being developed. It was during this period that photo interpretation became a recognized field of expertise.

Civilian use of vertical aerial photography greatly improved during the 1920s and 1930s because of advances in both photographic methods and in the aeroplane as a platform. Aerial photography was used in the compilation of topographic maps, and by geologists, foresters and planners, mainly in North America and Europe, but occasionally to acquire information from more remote areas which might otherwise be unobtainable. During World War II further developments occurred, e.g.:

  1. The water penetration capability of aerial film was recognized which meant that bathymetric data could be acquired.
  2. Colour infrared film was developed for camouflage detection.
  3. Advances in radar technology permitted the development of smaller transmitting and receiving equipment, appropriate for airborne use.
  4. A large area of the Pacific war zone was photographically mapped.

During the 1940s and 1950s large-scale, complete country coverages were undertaken using black and white panchromatic aerial photography, i.e. for many of the colonial countries, especially in Africa. By the 1960s aerial photography had been operative long enough to allow for the study of spatial/temporal variations in the environment.

The period from the late 1950s has been extremely active for RS, with developments in the whole applications field occurring at an exponential rate. With satellite launches occurring regularly, following SPUTNIK 1 in 1957, the interest in RS concentrated on the use of this new and unique platform. In 1959 the first earth images were transmitted from EXPLORER 6 and the first meteorological satellite, TIROS-1, was launched in 1960. The next major advance for RS occurred with the launch of the Earth Resources Technology Satellite (ERTS-1, later renamed Landsat 1) in 1972. This was the first satellite designed to provide long-term, uniform global coverage, having the ability to transmit data gathered on a variety of instruments, for eventual mapping at scales of 1:250 000 to 1:1 000 000. Since Landsat 1 there has been a succession of earth monitoring satellites launched, first by the U.S.A. and the U.S.S.R., but more recently by other countries. Their equipment has become progressively more sophisticated allowing a greater range of imaged data to be interpreted at a more detailed spatial resolution (Travaglia, 1989).

As both Howard (1985) and Butler et al (1988) make clear, we should not let the blossoming satellite technologies mask the fact that airborne RS is still thriving and vital. Aircraft have a number of distinct advantages over satellites, mainly in terms of their flexibility of altitude, scheduling and payload. They do not have the same cloud cover problems that satellites have and they can provide low cost, excellent images of smaller areas. Howard (1985) estimates that airborne techniques, using colour infrared technology at high altitude, can allow more than 20 000 km2 to be photographed, and be thematically mapped at a scale of 1:25 000 to 1:100 000, each day.

It is the rapid surge in the electronics industry which has permitted the post 1960s boom in RS, i.e. because as well as providing for actual developments in data capture, data transmission, image processing, etc., it has spawned the computer technology advances which have been vital to all aspects of space science and have allowed the huge data streams to be efficiently handled. This surge in RS has also been aided by the influx of new ideas from a variety of related disciplines, by the availability of funding for space related activities and by the access to an increasing range of software and hardware. RS has a number of positive advantages over other sensing systems:

  1. It allows change to be monitored in a systematic and orderly way.
  2. It is efficient and very cost-effective in per km2 terms.
  3. It overcomes many data collection problems, e.g. in isolated areas and the fact that normal data collection may terminate at political boundaries.
  4. It can provide for instantaneous updating of information.

Jackson and Mason (1986) report that modern RS has now successfully overcome the problems which it had in the 1960s and 1970s of being “technology pushed”. As useful applications of RS imagery have been developed, especially in the fields of food production and environmental awareness, RS is now also being “user pulled”. This recent trend has been greatly helped by the growing ability to successfully integrate remotely sensed data into GIS - in fact, we would suggest that if it was not for the functionality offered by GIS, then the future for RS might be rather uncertain (see also Ehlers et al, 1989).

4.3 Electromagnetic Radiation - The Basis of Remote Sensing

“The aim of environmental remote sensing is to utilize sensors, which are mounted on aerial platforms, to identify and/or measure parameters of an object according to variations in the electromagnetic radiation (EMR) emitted by, or reflected from the object.” Contained within this statement are a number of concepts which will receive individual attention in the next three sections, in an attempt to clarify necessary RS principles. The interested reader should consult the FAO sources quoted in section 4.1 for further details.

Although we cannot see light (or sound) travelling, we know that it does so. This travel involves the transfer of energy through space or matter in the form of wave motions. The waves that make up EMR travel at a constant speed of 300 million meters per second. All objects reflect and radiate EMR. The amount of electromagnetic energy emitted is a function of the object's temperature - as temperatures increase, the intensity of the radiation emitted increases. There is a whole family of waves which collectively are called electromagnetic vibrations and which vary in their wavelength, and hence (since their speed is constant in space) their frequency. This family may be displayed as a spectrum of energies, hierarchically arranged by wave frequency or wavelength (Figure 4.1). The spectrum covers a vast continuum of wavelengths as indicated, and it is usual to rather arbitrarily differentiate between the major wave bands based on certain properties such as their source, method of generation, means of detection, selected applications, etc. Only specific wavelength bands are of interest to RS, i.e. those forming the “windows of transmission”, because at these wavelengths filtering out of EMR by the atmosphere is at a minimum. These bands are shown in Table 4.1.
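Since the speed of EMR is constant, wavelength and frequency are tied together by the relation c = wavelength × frequency. As an illustrative aside (the function name and example values here are our own, purely for demonstration), this relation can be sketched as:

```python
# Speed of light in a vacuum, in metres per second (the "300 million
# meters per second" quoted in the text).
C = 3.0e8

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of an electromagnetic wave, from c = wavelength * frequency."""
    return C / wavelength_m

# Visible green light, at about 0.55 um, lies near the middle of the
# eye's sensitive range; its frequency is on the order of 10**14 Hz.
print(f"{frequency_hz(0.55e-6):.2e} Hz")
```

The same relation underlies the ordering of the spectrum in Figure 4.1: as wavelength falls, frequency (and photon energy) rises.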

The energy which is sensed by the different RS systems is a function of various parameters which might affect the energy before it is received by the sensors. This is shown in Figure 4.2 which indicates that EMR can be natural, either reflected light and other radiations from the sun (Source 1) or emitted heat from the earth (Source 2), or it can be man-made such as from a power station or a radar system. The amount and type of radiation emitted or reflected depends upon incident energy (mainly from incoming solar radiation), the nature of the earth's surface and on the interaction with the earth's atmosphere.

Figure 4.1 The Electromagnetic Spectrum


The full electromagnetic spectrum is shown below, with the wavelength range marked in. The sources and detectors for each part are also shown.

Table 4.1 Wave Bands of the EMR Spectrum of Interest to Remote Sensing

Wave Band and Wavelength | Detectors | Some Characteristics
Visible (0.4–0.7 µm) | Black & white plus colour photography; T.V. camera; optical scanner. | High atmospheric scattering effect. Most EMR is reflected solar radiation, therefore only used in daylight. Penetrates water.
Near Infrared (0.7–3.0 µm) | Infrared scanner; infrared thermography; multi-spectral scanner; photography; optical scanner. | High reflectance of vegetation. Again solar energy reflected by surfaces.
Medium Infrared (3.0–8.0 µm) | As above. |
Far Infrared (8.0–1000 µm) | Optical scanner. | Predominantly radiation emitted by the earth and atmosphere. Does not penetrate clouds.
Microwave (1 mm–100 cm) | Side Looking Airborne Radar (SLAR); scanning radiometer. | Can penetrate clouds. Imagery acquired in active or passive mode - daytime or night-time.

Figure 4.2 The Key Features of the Remote Sensing Data Collection Process (after Curran, 1985)


4.3.1 Incident Energy

This comes mainly from the sun and, in the range of the visible and near infrared part of the spectrum, it is the proportion of the incident energy reflected by the “object” on the ground. When the energy sensed is in the range of thermal radiation it comes mainly from the emission of the “object” on the ground, which is itself a function of the sun's incident energy which has been absorbed by that object and then re-emitted as thermal radiation. Incident energy from the sun will vary with season or latitude (affecting the angle of the sun), with the length of time the sun has been shining and with the angle of the object on the ground. When analyzing remotely sensed data it is important to consider dates, time of acquisition and relief.

4.3.2 Effects of Atmosphere

The atmosphere can affect the amount of radiation received by the sensor because the atmosphere itself is heterogeneous, being made up of many gases as well as having dust particles and other pollutants. The atmosphere may scatter light in the visible band and absorb it in the ultraviolet and infrared bands. About 18% of the incident radiation in the atmosphere is absorbed or scattered and about 35% of the incoming solar energy is reflected by the earth and the atmosphere, including clouds. Scattering is caused by particles in the atmosphere reflecting the energy, and the intensity of the scattered EMR depends on the ratio of the wavelength to the size of the particles. Scattering caused by small particles is selective relative to the wavelength, affecting shorter wavelengths more; scattering related to large particles is non-selective, affecting all wavelengths. Because of scattering, the energy received by the sensor includes reflections from the atmosphere as well as from the target (object). Complex algorithms are needed to correct this effect. Atmospheric absorption reduces the amount of EMR reaching the sensor in some wavelength bands. Figure 4.3 shows the percentage of EMR which can pass through the atmosphere as a function of wavelength (revealing the atmospheric windows), and the gases responsible for absorption are noted. Microwave radiations are unaffected by atmospheric conditions, which makes them very useful, especially in cloudy areas such as the tropics.
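The selectivity of small-particle scattering noted above follows, to a first approximation, Rayleigh's law: scattered intensity varies with the inverse fourth power of wavelength. A short sketch (our own illustration; the function name and wavelengths are hypothetical examples) shows why shorter wavelengths are affected so much more:

```python
def relative_rayleigh_scattering(short_um: float, long_um: float) -> float:
    """How many times more strongly the shorter wavelength is scattered,
    assuming scattered intensity proportional to wavelength ** -4
    (the Rayleigh approximation for particles much smaller than the
    wavelength)."""
    return (long_um / short_um) ** 4

# Blue light (~0.45 um) versus red light (~0.65 um):
ratio = relative_rayleigh_scattering(0.45, 0.65)
print(f"blue is scattered about {ratio:.1f} times more strongly than red")
```

This is why short-wavelength (e.g. blue) bands are noisier for RS purposes, and why correction algorithms concentrate on the shorter visible wavelengths.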

Figure 4.3 Percentage of EMR Able to Pass Through the Atmosphere as a Function of Wavelength (after Sabins, 1978)


4.3.3 Ground Influences

The element of the earth's surface which is in the field of view of the sensor (the target or object) will produce, by reflection or emission, the EMR measured by the RS sensor. The amount of energy transmitted or reflected depends on what the target consists of and on its thickness. A target also absorbs radiation, and this can affect the temperature of the target and thus the amount of energy radiated per second. So all targets (or objects) in the environment emit and reflect different intensities and types of EMR in different portions of the spectrum, i.e. they have a so-called spectral signature which is predictable and repeatable. Figure 4.4 gives the spectral signatures of various natural features. These signature curves are dependent on a number of interactions between incoming radiation and the micro and macro-structure of the matter irradiated. Spectral signatures may vary temporally, e.g. as plants grow, or spatially with different types of vegetation, different soil conditions, water availability, effects of topography, etc.

Figure 4.4 Spectral Signature of Various Natural Surface Features


4.4 Remote Sensors

Sensors are the devices used to gather EMR. They will typically consist of four components, i.e. collectors, detectors, signal processors and recording units. There are several ways of classifying sensors; we will describe them under two headings:

  1. Framing systems. These include various types of camera which record instantaneously an entire image.

  2. Scanning Systems. These employ a detector (electronic sensor) which sweeps across a scene in a series of parallel lines collecting data in order to record an image. They may employ passive sensors, which record reflected or emitted EMR from natural sources, or active sensors which illuminate an object with their own radiation source, and then record the “echo”.

In this section we will be concerned with sensors in theoretical terms; in Section 4.5 we will consider the actual sensors carried by operational satellites.

4.4.1 Framing Sensor Systems

Cameras may be used from various platforms. We will concentrate here on points significant to satellite photography since airborne photography is well documented in sister FAO publications (Butler et al, 1987 and Dainelli, 1988).

The still photography camera is the best known, simplest and cheapest of all sensors, and still photography produces images having a better resolution than those produced by electronic sensors. Cameras may produce simple, single images in one spectral band which are suitable for many referencing purposes, or they can produce overlapping pairs of aerial photographs which, when viewed using a stereoscope, give a three dimensional perspective of the landscape. But their chief use is in multi-spectral photography. Here a number of cameras may take simultaneous images of an object, using several band-pass filters, which each allow EMR information relative to particular wave bands to be recorded. The number, position and width of suitable colour filters can be optimized in a problem oriented manner so that the controller is able to discriminate between a wide range of features (within the visible and infrared bands - in the wavelengths between 0.4 and 1.3 um). The quality of the photographic image will depend on the inter-related factors of: focal length; the angle of view; scale of the photograph; the contrast ratio; the picture resolution and the film speed (Butler et al, 1988).

There are a variety of cameras suitable for satellite and/or aerial RS, and their selection depends upon the nature of the application. In principle mapping frame aerial cameras are similar to normal cameras except that they have:

  1. Calibrated lenses.
  2. High geometric accuracy.
  3. A medium to large format.
  4. A more complex mechanical and electrical configuration.

Though cameras have the advantages already noted, they do have some disadvantages, i.e. there is a loss of resolution during photo-chemical processing or copying, or in analogue-to-digital conversion for subsequent computer processing, and they have a limited spectral sensitivity. They can also only function in favourable weather conditions.

4.4.2 Framing System Films

Cameras will require different types of film for different purposes. The main types, according to their range of spectral sensitivity, are:

  1. Orthochromatic. These have a very good discrimination in the green bands and are used mainly for cartographic reproduction.

  2. Panchromatic. These cover the whole of the visible spectrum, with good sensitivity, except for green bands (0.5 um) which can be compensated for with a filter. This film is inexpensive, is easy to process, has a high spatial resolution and special filters can be used to enhance selected objects (targets).

  3. Black and White Infrared. This is similar to panchromatic except that its greater spectral sensitivity means that near infrared wavelengths can be recorded in addition to visible light. Usually a dark red filter is used to screen out the visible portion of the spectrum, so that only the near infrared portion is recorded, which results in a greater penetration of the atmosphere. This film is used mainly for detecting different vegetation stages and types, plus the existence of water.

  4. Natural Colour. The spectral range of this film is similar to that of panchromatic. It is composed of three layers, each sensitive respectively to one of the three primary colours - blue, green and red. Colour images offer a range of about 20 000 natural shades whilst black and white is limited to only 200 grey tone shades, i.e. colour film allows many more features to be distinguished, because of its greater sensitivity to tints and shades. Colour film has a number of applications, e.g. its sensitivity to sub-surface water makes it especially useful for coastline definition and the estimation of water depth and sediment content. However, colour photographs are more expensive, they have a poorer image definition and they cannot be taken from a high altitude.

  5. False Colour. These films are formed by having three layers sensitive to green, red and near infrared radiation, respectively modulated into blue, green and red. The film has been designed to achieve several advantages, e.g. a high penetration of the atmosphere, sharp resolution and definition of water bodies and a good response to the infrared reflectivity of healthy vegetation. However, these films have a limited exposure tolerance and require refrigerated storage.

4.4.3 Scanning Sensor Systems

This is the main alternative to photographic systems for detecting and recording EMR. Several bands of the EMR spectrum, either from the ultraviolet to infrared regions (multispectral scanners) or the microwave bands (radiometers), may be scanned simultaneously by optically splitting the collected radiation and diverting each part to a separate detector element. Final image products can be photographs or computer compatible tapes containing digital data. Scanning sensors can be either passive or active. Passive sensors detect natural incoming EMR, whilst active systems illuminate the target with their own radiation and detect the return (the so-called “echo”).

Passive sensors

These sensors are called radiometers and they can detect EMR within the ultraviolet to microwave wavelengths. Two important spatial characteristics of passive sensors are:

  1. Their “instantaneous field of view” (IFOV) - this is the angle over which the detector is sensitive to radiation. This will control the picture element (pixel) size which gives the ground (spatial) resolution of the ultimate image (Figure 4.5), i.e. the spatial resolution is a function of the detector angle and the height of the sensor above the ground. For more details on spatial, spectral, radiometric and temporal resolutions see Lechi (1988).

    Figure 4.5 The Concept of IFOV and AFOV (after Avery and Berlin, 1985)

  2. The “swath width” - this is the linear ground distance over which the scanner is tracking (at right angles to the line of flight). It is determined by the angular field of view (AFOV - or scanning angle) of the scanner. The greater the scanning angle, the greater the swath width (Figure 4.5).
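Both characteristics can be related to sensor altitude with simple geometry: ground resolution is approximately the IFOV (in radians, small-angle approximation) multiplied by altitude, and swath width follows from the AFOV. A sketch, using hypothetical sensor values of our own choosing:

```python
import math

def ground_resolution_m(ifov_mrad: float, altitude_m: float) -> float:
    """Approximate ground pixel size: IFOV (small angle, in milliradians)
    multiplied by sensor altitude."""
    return (ifov_mrad / 1000.0) * altitude_m

def swath_width_m(afov_deg: float, altitude_m: float) -> float:
    """Across-track ground distance covered for a given angular field of view."""
    return 2.0 * altitude_m * math.tan(math.radians(afov_deg) / 2.0)

# Hypothetical scanner: a 0.1 mrad IFOV at 800 km altitude gives 80 m pixels,
# and an 11.6 degree AFOV sweeps a swath of roughly 160 km.
print(ground_resolution_m(0.1, 800_000))  # 80.0
print(round(swath_width_m(11.6, 800_000) / 1000.0))
```

This makes clear the trade-off noted in the text: for a fixed detector, flying higher widens the swath but coarsens the spatial resolution in direct proportion.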

There are two main categories of passive sensor:

  1. A mechanical scanning radiometer. This is an electro-optical imaging system on which an oscillating or rotating mirror directs the incoming radiation onto a detector as a series of scan-lines perpendicular to the line of flight (Figure 4.6). The collected energy on the detector is converted into an electrical signal. This signal is then recorded in a suitably coded digital format, together with additional data for radiometric and geometric calibration and correction, directly on magnetic tape on board the sensor platform.

    Figure 4.6 Optical Mechanical Scanning System

  2. A push broom radiometer. This uses a wide angle optical system in which the whole scene across the AFOV is imaged onto a detector array at one time, i.e. there is no mechanical movement (Figure 4.7). As the sensor moves along the flight line, successive lines are imaged by the sensor and sampled by a multiplexer for transmission. The push broom system is generally better than the mechanical scanner since there is less noise in the signal, there are no moving parts and it has a high geometrical accuracy.

Figure 4.7 Characteristics of a Push Broom Radiometer (after Avery and Berlin, 1985)

Active sensors

All active sensors illuminate objects with their own source of radiation. The illumination will either induce an object to emit radiation or cause it to reflect the sensor-produced radiation. This ability means that no sunlight is required, so imagery can be recorded by day or night, or through clouds and light rain. Some active sensor systems are surface-based, e.g. sonar; others can be carried in aircraft, e.g. Side Looking Airborne Radar (SLAR); whilst others can be mounted in satellites, e.g. Synthetic Aperture Radar (SAR). We will briefly review airborne and satellite active systems, which are commonly called radar, and which are generally classified as either imaging or non-imaging:

  1. Imaging Radars. These display the radar backscatter characteristics of the earth's surface in the form of a strip map or a picture of a selected area. A type used in aircraft is the SLAR whose sensor scans an area not directly below the aircraft, but at an angle to the vertical, i.e. it looks sideways to record the relative intensity of the reflections so as to produce an image of a narrow strip of terrain. Sequential strips are recorded as the aircraft moves forward allowing a complete image to be built up (Figure 4.8). The SLAR is unsuitable for satellites since, to achieve a useful spatial resolution, it would require a very large antenna. A variant used in satellites is the SAR whose short antenna gives the effect of being several hundred times longer by recording and processing modified data.

    Figure 4.8 The Synthetic Aperture Radar System (after Avery and Berlin, 1985)

  2. Non-imaging Radar. These are also called scatterometers since they measure the scattering properties of the region or object being observed, i.e. the roughness of the surface over a wide swath on either side of the spacecraft. A type of scatterometer is the radar altimeter, which can provide an accurate height assessment for satellites, and these measurements can yield valuable information on topographic variations or sea surface roughness.

A further type of active sensor is the laser radar (LIDAR). LIDARs use lasers to generate short, high power light pulses. These can be used to measure the intensity of light back-scattered by the target as a function of the distance from the sensor. Because of their size LIDARs are presently limited to airborne craft.

4.5 Remote Sensing Platforms and Sensors Being Carried

In this section there will be no need to detail all the various platforms and their sensors since this has been exhaustively studied elsewhere, e.g. with regard to fisheries applications see Cheney and Rabanal (1984), Butler et al (1988) and Petterson (1989). Here it will be appropriate to mention some of the platforms commonly used, to explain the two general types of environmental satellite system, to summarize the operational systems presently in use and to exemplify the main sensor types being carried. Since it has received little attention elsewhere we will briefly mention the scope and availability of RS imagery from the U.S.S.R.

4.5.1 Sensor Platforms

Sensors can be carried on space, air, terrestrial or water-borne platforms but it is beyond our remit to examine the latter two types. There are various major airborne or space platforms as follows:

  1. Balloons. These may be free floating or anchored, and the former can be gas filled, hot air or propelled. They are now infrequently used, because they are slow, although there has been some discussion on bringing back into service the dirigible balloon.

  2. Helicopters. They may be useful for the essential “ground truthing” work, i.e. collecting statistical data in more remote areas to verify images obtained from higher and faster platforms. They have rarely carried sensors directly.

  3. Space Shuttles or Laboratories. These are essentially manned space missions. They frequently carry experimental payloads which may require human testing or adjustment, or interactive participation with ground researchers. Any of the previously mentioned sensors can be utilized, and some of them, e.g. photography, with a great deal of flexibility.

  4. Airborne. There are several sub-categories of this platform, according to flight altitude:

    1. High altitude aircraft - these usually operate at over 8 000 meters allowing for photography at about a 1:100 000 scale, or the use of multi-spectral scanners and Radar systems.
    2. Medium altitude aircraft - which operate at 3 000 to 8 000 meters and can take photographs at a 1:20 000 to 1:80 000 scale or carry multi-spectral scanners.
    3. Light aircraft - which fly below 3 000 meters and can do aerial reconnaissance, take large-scale photographs and can supplement missing or uncertain satellite imagery.

  5. Satellites. Since these are the platforms which provide the bulk of the remotely sensed data, and they are likely to be of increasing importance in the future, we shall examine them in more detail in the next section. There have now been many hundreds of satellite launches, mostly by the U.S.A. and U.S.S.R., and the majority have been for military purposes. Recently several other countries have launched their own satellites, either independently or in joint ventures. Initially, satellites were largely experimental, but they are now increasingly research or operational platforms, with most of them carrying a varied sensor payload. Their primary capability is to carry sensors which can monitor the entire earth surface on a periodic basis, sensing a large area during each revolution. They all operate at an altitude which is sufficient to escape from the earth's atmospheric drag but still remain within the dominant gravitational field, i.e. between 150 kms and 40 000 kms. Most satellites have been launched from rockets, whilst some have been launched aboard a shuttle spacecraft, from where they are unloaded into space. It has been recently demonstrated that the in-flight repair of satellite systems is now viable, though this is unlikely to be cost-effective for unmanned craft.

The advantages of satellites include the repetitive coverage of the earth's surface at various scales and at varying resolutions, with data being acquired on a routine and cost-effective basis. Often satellite-sensed data is the only information available for large tracts of ocean, mountain, desert or tropical forest areas. The disadvantages include their large capital costs, which include permanent monitoring and receiving stations, their relatively poor resolution for many environmental purposes and the fact that cloud cover remains a problem for many sensing devices.

4.5.2 Types of Environmental Satellites

It is convenient to classify environmental RS satellites into two major types, i.e. geostationary and near-polar orbiting.

Geostationary satellites

These are satellite sensing systems which are boosted into a high geosynchronous orbit at approximately 35 900 kms above the equator, i.e. at this altitude the satellite's orbital period exactly matches the earth's rotation period, so that it remains fixed over one point on the equator. Because of this height, they have a limited number of uses, e.g. to transmit telecommunication signals or to get a broad view of the weather. The fact of remaining stationary means that they can achieve a high temporal resolution, but the great height means that spatial resolution is normally only in the range of 2 to 5 kms, according to wavelength. This type of satellite was first launched in 1966 and there are currently five geostationary satellites which each cover a different portion of the earth (Figure 4.9). They can image the earth's surface between latitudes 80°N and 80°S and they are able to image and transmit data on their whole viewable area every 30 minutes.
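The quoted altitude of roughly 35 900 kms can be checked from first principles: Kepler's third law gives the orbital radius whose period equals the earth's rotation period, and subtracting the earth's radius yields the altitude. A sketch (constants are standard published values, not from the text):

```python
import math

MU_EARTH = 3.986004418e14      # Earth's gravitational parameter (m^3/s^2)
EQUATORIAL_RADIUS_KM = 6378.1  # Earth's equatorial radius (km)
SIDEREAL_DAY_S = 86164.1       # Earth's rotation period (s)

def geostationary_altitude_km() -> float:
    """Altitude at which the orbital period equals the earth's rotation
    period, from Kepler's third law: a = (mu * T**2 / (4 * pi**2)) ** (1/3)."""
    semi_major_axis_m = (MU_EARTH * SIDEREAL_DAY_S ** 2
                         / (4.0 * math.pi ** 2)) ** (1.0 / 3.0)
    return semi_major_axis_m / 1000.0 - EQUATORIAL_RADIUS_KM

print(round(geostationary_altitude_km()))  # about 35 786 km
```

The computed figure of about 35 786 kms is consistent with the approximate 35 900 kms cited above.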

Figure 4.9 Positions and Names of the Five Geosynchronous Satellites Providing Meteorological Data (after Richards, 1986)

Near-polar orbiting satellites

These satellites orbit the earth with an inclination relative to the equator of nearly 90 degrees, i.e. so that their orbit nearly crosses the north and south poles (Figure 4.10). Their orbit height varies between about 270 kms and 1 600 kms and it is usually sun-synchronous - meaning that it crosses the equator at the same sun time each day. By having this type of orbit, the satellite passes over any particular point on the earth at the same local time, which is useful for the comparative analysis of multi-temporal data. One complete revolution of the earth takes about 95 to 115 minutes (depending on altitude), meaning that 12 to 16 revolutions are achieved each day. The exact inclination of the flight path will determine the time period between re-visits to any specific location, but it is commonly once every 16 to 20 days (Figure 4.11). These satellites have a working life expectancy of about four years.
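The link between orbit height, period and daily revolutions follows from the same Keplerian arithmetic (a rough sketch for circular orbits, ignoring perturbations; the computed range is slightly wider than the 95 to 115 minutes quoted above):

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter (m^3/s^2)
R_EARTH_M = 6.378e6   # equatorial radius of the earth (m)

def period_minutes(altitude_km):
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH_M + altitude_km * 1000.0
    return 2.0 * math.pi * math.sqrt(a**3 / MU) / 60.0

# The altitude range quoted in the text for near-polar orbiters:
for h in (270, 900, 1600):
    t = period_minutes(h)
    print(f"{h:4d} km -> period {t:5.1f} min, {1440.0 / t:4.1f} revolutions/day")
```

Altitudes of 270 to 1 600 kms give roughly 90 to 118 minute periods, i.e. about 12 to 16 revolutions per day, as stated.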

Figure 4.10 A Typical Orbital Track for a Polar Orbiting Satellite (after Taranik, 1978)


Figure 4.11 Typical Orbital Track for Each Orbit and for a Repeat Visit (after Taranik, 1978)


4.5.3 The Major Operational Environmental Satellites

A number of environmental RS satellite systems and/or programmes have been in operation since the mid-1960s. Some of these programmes are ongoing, others have ceased. Table 4.2 attempts to list the main programmes. We mention many programmes briefly since a large amount of data has been acquired from them, much of which may still be valid and available. We shall describe here the main characteristics of five satellite systems, i.e. Landsat 4 and 5, SPOT-1, ERS-1, MOS-1 and Kosmos - their sensors will be described in the next section. These are selected because they are either currently providing data or because they are the most recent of the orbiting satellite series. It is difficult to select satellite systems for study which are particularly relevant to aquaculture and inland fisheries, because much of the environmental data gained from any of the systems is potentially useful. Section 4.7 will be concerned with the potential applications of satellite data to location analysis for fish production sites.

Landsat 4 and 5

Landsat 4 and 5 were launched in July, 1982 and March, 1984 respectively. They both have an angle of inclination of 98.3 degrees and an orbital time of 98.5 minutes. The satellites make 14 to 15 revolutions per day and it takes 16 days before a revisit track is made. These satellites are a continuation of the original Earth Resources Technology Satellites (ERTS) programme, initiated in 1972, which later developed into the Landsat series. The Landsat swathing pattern is illustrated in Figure 4.12, which also shows the width of swath and the distance between successive orbits. Landsat 4 and 5 differ from the earlier Landsats by the introduction of the Thematic Mapper sensor and the exclusion of the Return Beam Vidicon (Figure 4.13).

Table 4.2 The Main Environmental Polar Orbiting Satellite Systems or Programmes
Satellite Programme      Country        Year of       Operational   Sensors Carried
                                        1st Launch    State
Tiros/NOAA 1st series    U.S.A.         1970–1976     ceased        AVHRR; AVCS.
Landsat 1, 2, 3          U.S.A.         1972–1978     ceased        MSS; RBV.
Meteor 2nd series        U.S.S.R.       1977          active        MSS; MRTVK.
Nimbus-7                 U.S.A.         1978          ceased        CZCS; SMMR.
Seasat-A                 U.S.A.         1978          ceased        SMMR; RA; SAR; SASS.
Landsat 4, 5             U.S.A.         1982–1984     active        MSS; TM.
Kosmos                   U.S.S.R.       1983          active        SLAR; MRIR; MRTVK; plus cameras.
SPOT-1, 2                France         1986          active        HRV.
MOS-1a, b                Japan          1987–1990     active        MESSR; MSR; VTIR.
ERS-1                    European       1990          planned       AMI; SAR; ATSR-M; RA; Scatterometer.
                         Space Agency
Landsat 6                U.S.A.         1992          planned       ETM.

AMI      Active microwave instrument.
ATSR-M   Along-track scanning radiometer.
AVCS     Advanced vidicon camera.
AVHRR    Advanced very high resolution radiometer.
CZCS     Coastal zone colour scanner.
ETM      Enhanced thematic mapper.
HCMR     Heat capacity mapping radiometer.
HRV      High resolution visible instrument.
LIMBS    Profile temperature radiometer.
LISS     Linear imaging self scanning sensor.
MESSR    Multispectral electronic self-scanning radiometer.
MRIR     Medium resolution infrared radiometer.
MRTVK    Multispectral television system.
MSR      Microwave scanning radiometer.
MSS      Multispectral scanner.
RA       Radar altimeter.
RBV      Return beam vidicon system.
SAR      Synthetic aperture radar.
SASS     Seasat satellite scatterometer.
SLAR     Side looking airborne radar.
SMMR     Scanning multichannel microwave radiometer.
TM       Thematic mapper.
VIRR     Visible and infrared radiometer.
VTIR     Visible and thermal infrared radiometer.

Figure 4.12 The Landsat Swathing Pattern and Successive Orbit Paths (after Taranik, 1978)


Figure 4.13 Configuration of Landsats 4 and 5


There are some 18 receiving stations around the world from which Landsat data is transmitted to NASA's Goddard Spaceflight Center. It is then passed to a commercial company (EOSAT) for processing and distribution. Both Landsat 4 and 5 have well outlived their life expectancy, yet continue to operate because some of the on-board systems have been closed down to save power. There is likely to be a “data gap” when these two satellites finally cease to transmit, i.e. since lack of funding and technical problems will prevent Landsat 6 from being launched until the second quarter of 1992.

SPOT-1

SPOT-1 (Système pour l'Observation de la Terre) was launched in February, 1986. It was constructed by the French in cooperation with Belgium and Sweden. It has a sun-synchronous orbit at an inclination of 98.7 degrees, and a revolution period of 101.4 minutes. It makes 14 or 15 revolutions per day and revisits the same track every 26 days. Its altitude varies from 820 to 840 kms and it can acquire images between 84 degrees North and South. A key feature of SPOT is the provision for off-nadir viewing, i.e. it can “look” sideways for up to 27 degrees from the vertical in either direction, extending the field of view by 475 kms each way. This allows for a much reduced revisit time, although images would then necessarily be from an oblique angle. This off-nadir facility is steerable from ground control (Figure 4.14). Off-nadir viewing also allows for stereoscopic viewing - pairs of images of a given scene can be recorded at different viewing angles during successive satellite passes in the vicinity of the scene concerned (Figure 4.15).
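The 475 km figure can be roughly checked with simple trigonometry. A flat-earth sketch gives a somewhat smaller value, since the curvature of the earth lengthens the slant path at large viewing angles (the mid-range altitude used below comes from the text):

```python
import math

ALTITUDE_KM = 830.0   # mid-range SPOT-1 altitude quoted in the text
TILT_DEG = 27.0       # maximum off-nadir viewing angle

# On a flat earth the ground offset of the viewed point is h * tan(theta);
# earth curvature pushes the true value toward the quoted 475 km.
offset_km = ALTITUDE_KM * math.tan(math.radians(TILT_DEG))
print(f"flat-earth ground offset: {offset_km:.0f} km")
```

The simple estimate is about 420 km each way, of the right order for the quoted figure.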

Figure 4.14 SPOT Off-Nadir Revisit Capabilities (from SPOT Image Newsletter, 1986)


Figure 4.15 Stereoscopic Viewing Capabilities (from SPOT Image Newsletter, 1986)


European data from SPOT is received at stations in Toulouse (France) and Kiruna (Sweden). On-board recording of data is possible in areas beyond the range of ground receiving stations, for later transmission to Toulouse. Data can be received at stations over a maximum distance of 2 600 kms, and this means that transmissions can last for up to 800 seconds whilst passing a station. Stations are capable of receiving about 250 000 scenes a year. Each day, an observation sequence is loaded into the on-board computer from the Toulouse ground control station. Dissemination of SPOT data is via a commercial company, “SPOT Image”. SPOT-1 will be decommissioned in September, 1990. SPOT-2, with similar credentials to SPOT-1, was launched in January 1990 and two further satellites in the series are planned before 1998.

Marine Observation Satellite (MOS-1)

This was the first Japanese earth observation satellite - it was launched in February, 1987 and has a three-year scheduled life (Figure 4.16). It orbits at an altitude of 908.7 kms, having an inclination of 99.1 degrees, and it makes 14 orbits per day. There is repeat coverage every 17 days and it takes 237 orbits to gain a total worldwide coverage. MOS-1 does not carry tape recorders and thus ground stations are needed to acquire data from areas out of range of the Japanese Earth Observation Centre. The European Space Agency has an agreement with the Japanese Space Development Agency to acquire, process and distribute MOS-1 products in Europe. MOS-1b, with identical credentials to MOS-1, was launched in February, 1990 and a further four satellites in the series are planned.

Figure 4.16 The Configuration of MOS-1

ESA Remote Sensing Satellite (ERS-1)

ERS-1 is to be the first of a series of satellites, in a programme planned by the consortium of countries making up the European Space Agency (ESA) and due to be operational in the 1990s. Following its launch in 1991, ERS-1 will be placed in a sun-synchronous orbit and will give global coverage, including the polar regions. It will have an altitude of 777 kms, an inclination of 98.5 degrees and a revolution time of 100 minutes (Figure 4.17). ERS-2 is planned for launch in 1994. Real-time data will be relayed to stations at West Freugh (Scotland), Kiruna (Sweden), Fucino (Italy), Maspalomas (Canary Islands) and Prince Albert (Canada).

Kosmos

This Soviet satellite series is a continuation of an older series, and it includes satellites launched for a variety of purposes. It is managed by the “PRIRODA” State Remote Sensing Centre. Soviet officials note that launches in the series occur once every two or three months. The satellites are placed in low (270 kms) orbits and the repeat time for full coverage is 22 days. Their life expectancy varies, but it is always very short. There is no network of ground receiving stations.

4.5.4 The Major Environmental Remote Sensors

Here we will be concerned to look first at the main sensors being carried by Landsats 4 and 5, SPOT 1, ERS-1, MOS-1 and Kosmos, including their usual applications. We will then describe two other sensors which have been particularly useful for marine or water based purposes.

Figure 4.17 The Configuration of ERS-1

Landsat 4 and 5 sensors

Both Landsats carry similar sensors and there are two main types:

  1. Multi-Spectral Scanner. This system is a line scanning device which records in four bands, two in the visible and two in the near infrared parts of the spectrum. The system comprises a telescope, a mirror which reflects ground radiation onto a bank of 24 electro-optical sensors, band filters and a sampling system, an internal calibration system and various devices which ensure an orderly stream of digital data for each pixel and spectral band. It images six scan lines in each of the four spectral bands simultaneously, giving a 24 scan-line total. Resolution, as delimited by pixel size, is 83 m, and a six-bit quantization gives a possible range of 64 intensity values. The individually scanned scenes of one MSS image cover approximately 185 x 185 kms and each overlaps its neighbour by about 10%. The original images have a scale of 1:3 369 000 and one frame encompasses 34 000 km2. Data is recorded on magnetic tape for later transmission to receiving stations and is available in digital or analogue (photographic) form. Table 4.3 shows details of the bands and possible applications of the imagery.

    Table 4.3 Landsat MSS Bands and Applications
    Band  Spectral Range                      Features/Applications
    4     500–600 nm (green)                  Imagery from this band emphasizes movement of sediment laden water and shallow water bodies, shoals and reefs, etc.
    5     600–700 nm (red)                    Imagery from this band emphasizes cultural features, e.g. urban areas and roads, and sometimes bare soil surface colours.
    6     700–800 nm (near infrared)          This type of imagery emphasizes vegetation and landforms in general.
    7     800–1100 nm (second near infrared)  This imagery provides best penetration of atmospheric haze, and it emphasizes vegetation and land-water boundaries.

  2. Thematic Mapper. This sensor collects, filters and detects radiation in a similar 185 km swath. It records in seven spectral bands which include medium and thermal infrared. It provides a spatial resolution of 30 meters, except on the thermal infrared band, where it is 120 meters. The high spectral resolution is achieved by sensitive detectors, and an 8-bit quantization in the analogue-to-digital conversion process gives 256 grey levels. The Enhanced Thematic Mapper (ETM), to be launched on Landsat 6, will have an additional 15 meter resolution panchromatic band. Table 4.4 shows the potential applications of this sensor by band and wavelength.

SPOT-1 sensors

SPOT carries two identical high resolution visible (HRV) scanners, each of which can function independently. They can each scan a strip 60 to 80 kms wide along the flight line, the width varying with the viewing angle. Every 60 kms the SPOT data is cut to form a scene. The two sensors are designed to operate in either of two modes - panchromatic (black and white) or multi-spectral (colour) in the visible and near infrared spectral bands. The sensors are of the push-broom type. Each consists of a series of fixed linear arrays made up of electronic detectors known as CCDs (charge-coupled devices). Image data are collected by successively measuring the current generated by each detector along the array. In panchromatic mode each individual detector corresponds to one pixel and measures the reflectance of a 10 meter ground resolution cell. In multispectral mode the detectors are paired and thus measure a 20 meter pixel across the track. By doubling the time taken to obtain each sample, the along-track measurement of the cell also becomes 20 meters.
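The arithmetic linking detector count, pixel size and swath width in the two modes can be sketched as follows (the 6 000-detector figure per array is taken from Table 4.5):

```python
# Panchromatic mode: every CCD detector along the array is read individually.
pan_detectors = 6000   # detectors (= pixels) per line, from Table 4.5
pan_pixel_m = 10       # ground resolution cell (m)
print(pan_detectors * pan_pixel_m / 1000, "km swath at nadir")

# Multispectral mode: detectors are paired, halving the across-track pixel
# count and doubling the ground cell to 20 m - the swath is unchanged.
ms_pixels = pan_detectors // 2   # 3000 pixels per line
ms_pixel_m = pan_pixel_m * 2     # 20 m ground cell
print(ms_pixels * ms_pixel_m / 1000, "km swath at nadir")
```

Both modes therefore cover the same 60 km nadir swath, which is why Table 4.5 lists 3 000 and 6 000 pixels per line respectively.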

Table 4.4 Bands and Applications of the Landsat Thematic Mapper
Band   Spectral Range (micrometers)     Resolution   Features/Applications
TM 1   0.45–0.52 visible blue-green     30 m         Bathymetry in less turbid waters, soil/vegetation differences, deciduous/coniferous differentiation, soil types.
TM 2   0.52–0.60 visible green          30 m         Indicator of growth rate/vegetation vigour, sedimentation concentration estimates, turbid water bathymetry.
TM 3   0.63–0.69 visible red            30 m         Chlorophyll absorption/species differentiation, crop classification, vegetation cover and density, geological applications.
TM 4   0.76–0.90 solar near infrared    30 m         Water body delineation, biomass and stress variations.
TM 5   1.55–1.75 solar mid infrared     30 m         Vegetation moisture/stress, minerals.
TM 6   10.4–12.5 emitted thermal        120 m        Surface apparent temperatures, urban versus land use separation, distinguishing burned areas from water bodies.
TM 7   2.08–2.35 solar mid infrared     30 m         Hydrothermally altered zones, mineral exploration, soil type discrimination.

We have already mentioned the nadir, off-nadir and stereoscopic viewing capabilities of the system. The main applications of the stereoscopic imagery are in photogrammetry, for cartographic purposes, and in photo-interpretation for geological, geomorphological and hydrological studies. SPOT's other applications are largely land-use studies, the assessment of renewable resources and aiding mineral and oil exploration. The high resolution makes possible the compilation of topographic maps (at scales of 1:100 000) with contour intervals as small as 20 meters, thematic mapping at between 1:25 000 and 1:50 000, plus the direct compilation of digital terrain models. Table 4.5 gives the main characteristics of the HRV.

Table 4.5 Characteristics of the HRVs on SPOT-1
Characteristics of the HRV instrument       Multispectral mode              Panchromatic mode
Spectral bands                              3 bands: green 0.50–0.59 μm;    1 broad band: 0.50–0.75 μm
                                            red 0.61–0.68 μm;
                                            near infrared 0.79–0.89 μm
Instrument field of view                    4.13°                           4.13°
Ground sampling interval (nadir viewing)    20 m × 20 m                     10 m × 10 m
Number of pixels per line                   3000                            6000
Ground swath width (nadir viewing)          60 km                           60 km
Pixel coding format                         3 × 8 bits                      6 bits DPCM (1)
Image data bit rate                         25 Mbits/s                      25 Mbits/s

(1) DPCM (Digital Pulse Code Modulation) is a mode of data compression that does not degrade the radiometric accuracy of the image data (256 grey levels).

ERS-1 sensors

The main sensors on board will be:

  1. Active Microwave Instrument (AMI), combining the functions of a Synthetic Aperture Radar (SAR), a Wave Scatterometer and a Wind Scatterometer. The AMI will measure wind fields and wave spectra and obtain all-weather images. It will provide for a spatial resolution of 30 meters and have a swath of 99 kms.

  2. Radar Altimeter (RA) to measure significant wave height and provide measurements over ice and major oceanic currents.

  3. Along-Track Scanning Radiometer (ATSR-M) to determine sea surface temperatures and measure atmospheric water vapour.

  4. Precise Range and Range Rate Experiment (PRARE) for accurate satellite ranging and to allow ionospheric error correction.

The programme objectives include the global observation of waves and sea state, ocean currents, sea surface temperatures, sea ice and ice sheet dynamics, and also the provision of SAR imaging of land. These objectives will help with shipping, fisheries and offshore activities.

MOS-1 sensors

The details of the three sensors on board MOS-1a and 1b are given in Table 4.6. MOS is intended to establish fundamental technologies for earth observation satellites, primarily by observing oceanic phenomena such as ocean colour and temperature. The satellite observations are also expected to be of value to agriculture, forestry, fishery and environmental preservation.

Table 4.6 Bands, Features and Objectives of MOS-1 Sensors
Sensor                     MESSR                        VTIR                          MSR
Objective                  sea surface colour,          suspended sediment,           water vapour content,
                           vegetation, land use, etc.   stratospheric water vapour,   liquid water content,
                                                        Earth and sea surface         ice, snow, etc.
                                                        temperatures
Observation bands (μm)     0.61–0.69; 0.80–1.10         6–7; 11.5–12.5                -
Frequency (GHz)            -                            -                             23.8±0.2; 31.4±0.25
Beam width                 -                            -                             1.89±0.19; 1.31±0.13
Integration time (m sec)   -                            -                             10 & 47
IFOV (km)                  -                            -                             23
Swath width (km)           100 (each optic)             1500                          317

Kosmos sensors

Satellites in the series have carried different sensors and combinations of sensors. Of most interest for environmental observations are their cameras, details of which are shown in Table 4.7. The KFA-1000 provides for a ground resolution as small as 5 meters, giving it a great advantage over SPOT or Landsat. Another advantage is the frequency of cover. 98% of all PRIRODA's RS imagery is obtained from a combination of the three cameras carried on Kosmos satellites. After imaging in space, the exposed film is soft-landed to earth for processing.

Table 4.7 Specification of Cameras Carried by Kosmos Satellites (from Morrison and Bond, 1989)
Type of camera                     Spectral range (nm)             Image format (cm)   Area of coverage (km)
Band-specific, colour (a)          680–810; (2) 810–900            -                   (6 400); (13 700)
Multispectral and colour           (3) 515–565; (4) 460–505;       216×216             (46 700)
band-specific (b)                  (5) 580–800
Multispectral                      600–700; 700–850                -                   (59 000)

a The survey is performed by two cameras, each of which surveys a swath of territory to the left or right of the axial line of movement. As a result, imaging by each camera occurs with a deviation of 8° from the vertical.

b The MK-4 camera has 4 lenses, which survey one and the same area simultaneously. Black-and-white imaging is performed in 3 spectral bands (from 6 possible alternatives) depending on the films and filters used, and colour band-specific imaging is done on two-layer film in one spectral band (from 2 possible alternatives).

Other sensors of interest to aquaculture and inland fisheries

There have been a number of other satellite sensors launched in the past two decades which have been of potential value to aquaculture and inland fisheries. We will briefly describe two of these:

  1. The Heat Capacity Mapping Radiometer (HCMR). This sensor was launched as part of the Heat Capacity Mapping Mission in April 1978 and operated until September 1980. The sensor was a two channel scanning radiometer operating in the visible and near infrared band and the thermal infrared band. The main objectives of interest were:

    1. The mapping of natural and man-made thermal effluents.
    2. The detection of thermal gradients in water bodies.
    3. The mapping and monitoring of snow fields for water run-off prediction.
    4. The monitoring of marine oil pollution.

    Many of the products obtained from this mission are still available.

  2. The Coastal Zone Colour Scanner (CZCS). This was launched aboard Nimbus-7 in October 1978 and was operational until late 1984. The CZCS was a multi-spectral line scanner, optimized for use over water. It collected quantitative information on ocean colour, suspended sediments, chlorophyll concentrations, pollutants and temperature from the upper few meters of water. Both photographic and digital data are still available. Table 4.8 sets out the bands and measurements of CZCS.

Table 4.8 Bands and Measurements of the CZCS Sensor
Coastal Zone Colour Scanner (CZCS)
         Wavelength (μm)   Spatial Resolution   Swath Width   Measurements
Band 1   0.43–0.45         800 m                1800 km       chlorophyll absorption
Band 2   0.51–0.53         800 m                1800 km       chlorophyll distribution
Band 3   0.54–0.56         800 m                1800 km       gelbstoffe (yellow substance)
Band 4   0.66–0.68         800 m                1800 km       chlorophyll concentration
Band 5   0.70–0.80         800 m                1800 km       surface vegetation
Band 6   10.50–12.50       800 m                1800 km       surface temperature, diffuse attenuation coefficient

4.6 Data Processing of Remotely Sensed Imagery

The electronic images which have been captured by RS devices are either transmitted directly to earth or are stored on on-board recorders for later transmission. This represents the initial stage in a complex information flow which is depicted in Estes (1985) (Figure 4.18). The data scanned are retained in the form of pixel values, with each value representing the amount of radiation (the spectral reflectance) within a given band-width received by the scanner from the area of the earth's surface covered by the pixel. Pixel values are digitally coded by a certain number of bits, e.g. Landsat TM and SPOT use 8-bit codings which give a range of 256 possible values, and the values for any one pixel will change according to the particular spectral band being recorded. The area covered by a pixel (the resolution) is a function of the height of the sensor, the focal length of the lens or focusing system, the wavelength of the radiation and other inherent characteristics of the sensor itself. Each pixel will be allocated a co-ordinate in a grid referencing system.
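As an illustration of the data quantities these codings imply, the size of a single uncompressed scene can be estimated from the scene width, pixel size, band count and bits per pixel (a rough estimate: it treats all bands as having the same resolution, so the 120 m thermal band of TM is overstated, and it ignores calibration and housekeeping overheads):

```python
def scene_megabytes(swath_km, pixel_m, bands, bits_per_pixel):
    """Rough uncompressed size of one square scene, in megabytes."""
    pixels_per_line = int(swath_km * 1000 / pixel_m)
    total_bits = pixels_per_line ** 2 * bands * bits_per_pixel
    return total_bits / 8 / 1e6

# Landsat Thematic Mapper: 185 km scene, 30 m pixels, 7 bands, 8-bit coding.
print(f"TM scene : {scene_megabytes(185, 30, 7, 8):.0f} MB")
# SPOT HRV multispectral: 60 km scene, 20 m pixels, 3 bands, 8-bit coding.
print(f"HRV scene: {scene_megabytes(60, 20, 3, 8):.0f} MB")
```

A single TM scene thus runs to several hundred megabytes, which explains why data handling and processing capacity is such a practical concern.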

Figure 4.18 The Flow of Information in the Remote Sensing System


The pixel values are transmitted to earth (downlinked) as a stream of binary numbers. To reconstitute the images, ground based computers decode the binary data, allocating the appropriate colour tone to each pixel value. The images can then be displayed on a monitor or in some print-out form. At the initial stage they will be monochrome and in a pre-processed state. To perform image analysis processes there is a vast selection of computer hardware and software systems, for micro, mini or mainframe computers, which we cannot explore here but which Jensen (1986) reviews in some detail. These software systems should be capable of executing all, or a number of, specific processing functions, as shown in Table 4.9. Not all of the functions shown are essential, i.e. this will depend on the type of output required. We will describe here only the essential functions plus those more commonly used. Images from Landsat and SPOT can be purchased at various “levels” of processing.
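Of the enhancement functions listed in Table 4.9, the contrast stretch is among the simplest: 8-bit pixel values that occupy only part of the 0 to 255 range are linearly rescaled to span the whole of it, making faint differences visible on the display. A minimal sketch on a single band of pixel values:

```python
def contrast_stretch(pixels, out_max=255):
    """Linearly rescale pixel values so they span the full 0..out_max range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                      # flat image: nothing to stretch
        return [0 for _ in pixels]
    return [round((p - lo) * out_max / (hi - lo)) for p in pixels]

# A dull scene using only values 60..120 of the available 0..255 range:
print(contrast_stretch([60, 75, 90, 120]))  # [0, 64, 128, 255]
```

Real systems usually stretch between chosen percentiles of the histogram rather than the absolute minimum and maximum, so that a few extreme pixels do not dominate the rescaling.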

Table 4.9 Image Processing Functions Found in Many Image Processing Systems (from Jensen, 1986)
Preprocessing
 A.Radiometric correction (for system and environmental effects)
 B.Geometric correction (image to map or image to image)
Display and enhancement
 C.Black-and-white display
 D.Color composite display
 E.Density slice
 F.Magnification or reduction
 H.Contrast stretch
 I.Image algebra (band ratioing, differencing, etc.)
 J.Spatial filtering
 K.Edge enhancement
 L.Principal components
 M.Linear combinations (e.g., Kauth transform)
 N.Texture transforms
 O.Fourier transforms
Information extraction
 P.Supervised classification
 Q.Unsupervised classification
 R.Contextual classification
 S.Incorporation of ancillary data in the classification
Geographic information systems (GIS).
 T.Raster- or image-based GIS
 U.Vector- or polygon-based GIS (must allow polygon comparison)
Integrated systems
 W.Complete image processing systems (functions A to S)
 Y.Complete image processing systems and GIS (functions A to S and
  T or U)
 Z.Mainframe communication for micro- or minicomputer-based systems

4.6.1 Image Pre-Processing

The two functions involved here, radiometric and geometric corrections, will be essential if meaningful output is to be obtained. They are essential because there are a number of factors inherent in the RS system which contribute to the images being distorted in some way. These factors include:

  1. Changes in the attitude, velocity and altitude of the sensing platform.
  2. The forward motion of the platform causes scan skew.
  3. The scanners (in Landsat) do not have a constant scan velocity.
  4. The area covered by one pixel will have its shape distorted when viewing at an oblique angle.
  5. The geometry of the images is affected by the earth's rotation, its curvature and atmospheric refraction.
  6. Radiometry is affected by the sensor, e.g. sensor “noise” and poor calibration between detectors, by the atmosphere, e.g. presence of aerosols and scattering effect, and by the scene itself, e.g. effect of relief on reflection and type of reflection of the object.
  1. Radiometric Correction. Detector sensitivity will slowly change over time, making some detectors more or less sensitive to radiance than their neighbours. This results in images which have a “banding” or striped pattern which needs correcting. This effect can occur in both mechanical and push-broom scanners, as can “pixel drop”, when individual pixel radiance is not recorded. Radiometric distortion (atmospheric attenuation) is also a problem, as radiance is altered by the atmosphere through which it passes. This is especially a problem over water when there is a lot of atmospheric water vapour, i.e. radiance reaching the detector may be 20% from the water and 80% from the atmosphere. There are various correction methods, some of which are described in Butler et al (1988).

  2. Geometric Correction. This will involve several levels of pre-processing. The data has first to be corrected for earth curvature, earth rotation and satellite attitude errors. After this the image may still contain geometric distortions, with the center of the scene located to an accuracy of only a few kilometers. To improve this, a sufficient number of ground control points, which are readily identifiable both on the image and on a map, are selected for the calculation of a least-squares fit, and the results are then used to adjust the image to the map co-ordinates. Maps of different projections, e.g. Mercator, Peters, Lambert Conformal, etc., can be used.
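The least-squares adjustment from ground control points can be illustrated with a small numerical sketch. Here the fitted model is a first-order (affine) transformation from image (column, row) to map (x, y) co-ordinates; the control points and coefficients are invented for the example, and operational corrections often use higher-order polynomials:

```python
import numpy as np

# Hypothetical ground control points: (col, row) positions in the image and
# matching (x, y) map co-ordinates, generated here from a known affine
# transform so the recovered coefficients can be checked.
cols = np.array([10.0, 200.0, 40.0, 180.0, 90.0])
rows = np.array([20.0, 30.0, 190.0, 210.0, 100.0])
x = 500000.0 + 30.0 * cols + 2.0 * rows     # "map" eastings (m)
y = 4000000.0 - 30.0 * rows + 1.5 * cols    # "map" northings (m)

# Solve x = a0 + a1*col + a2*row (and similarly for y) by least squares.
A = np.column_stack([np.ones_like(cols), cols, rows])
coef_x, *_ = np.linalg.lstsq(A, x, rcond=None)
coef_y, *_ = np.linalg.lstsq(A, y, rcond=None)

# Any pixel can now be assigned map co-ordinates:
col0, row0 = 120.0, 50.0
print("easting :", coef_x @ [1.0, col0, row0])
print("northing:", coef_y @ [1.0, col0, row0])
```

With more control points than coefficients the fit is over-determined, and the residuals at the control points give a useful check on the quality of the registration.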
