
Introduction

Alexandre Castagna edited this page Mar 9, 2020 · 25 revisions

A sensor measuring the upward radiance at any given height above the surface is measuring the radiance of the surface-atmosphere system. The air medium between the surface and the sensor contributes both a direct signal (backscattered solar radiation) and indirect signals (attenuation of the radiance propagating in a given direction, and changes in the directionality of the light field). If the intent is to observe surface properties, the effects of the atmosphere must be removed in a process commonly called "atmospheric correction" (AC).

Most available AC codes compensate for the direct signal and for the indirect signal of attenuation, but do not compensate for the directionality effect of atmospheric scattering. The scheme below illustrates the problem:

The signal measured when pointing at the target (red line) is contaminated by radiance from surrounding surface area elements that was scattered by the atmosphere into the Field of View (FOV) of the sensor. This component is called the "adjacency effect" or "atmospheric blurring", and it causes a loss of contrast. For remote sensing of land surfaces, the loss of contrast degrades spectral classification and visual identification, and can be relevant especially under more turbid atmospheres. For remote sensing over water near (< 1 km from) the shore, however, the problem can be severe, since water typically has much lower reflectance than land, especially at wavelengths longer than 700 nm.

It is possible to compensate for the adjacency effect if the spatial pattern of atmospheric scattering depicted in the figure above can be described, which is done by means of a Point Spread Function (PSF).

Definition of a Point Spread Function

The formal definition of the Point Spread Function is an imaging sensor's response to a point source. For example, even when imaging an object in vacuum, the optical elements of the sensor assembly will change the direction of some rays, producing blur in the imaging system. This effect is normally evaluated during sensor calibration and we need not dwell on it further. The same principle holds, however, if we substitute the atmosphere for the optical elements as the scattering medium, with the point source being an area element at the surface. The PSF can then be understood as an array of weights describing the response of the detector elements to an area element source. This condition is illustrated in the leftmost figure of the panel below.
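This definition can be made concrete with a small numeric sketch. The Gaussian kernel below is a hypothetical stand-in for a real PSF (real atmospheric PSFs are not Gaussian); the point is only that convolving an image containing a single point source with the PSF reproduces the PSF itself:

```python
import numpy as np

def gaussian_psf(size, sigma):
    # Hypothetical illustrative PSF: a normalized 2D Gaussian kernel.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()            # weights sum to 1

def convolve2d(image, kernel):
    # Direct, zero-padded 2D convolution (kernel flipped, as convolution requires).
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i+kh, j:j+kw] * kernel[::-1, ::-1])
    return out

psf = gaussian_psf(5, sigma=1.0)
point = np.zeros((5, 5))
point[2, 2] = 1.0                 # a single point source
response = convolve2d(point, psf)
# The sensor's response to a point source is the PSF itself.
assert np.allclose(response, psf)
```

Because the PSF is normalized to sum to one, it redistributes the signal spatially without creating or destroying it, which is what makes it usable as an array of statistical weights.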

But the surface is formed by an infinite number of area elements, each reflecting photons, and if the atmosphere is constant over those area elements, their reflected photons will be spread by the atmosphere with the same distribution function. So we can change perspective and look at a single detector element receiving photons from all area elements. The PSF is now an array of weights describing how much signal is received from each area element. This is illustrated in the center figure and simplified in the rightmost figure. You can check that the paths in the rightmost image are the same as in the leftmost image, just aligned at their end position instead of their start position. The important conclusion is that, by symmetry, the PSF is the same from the perspective of the area element as from the detector element.

In practice the PSF is a convolution filter, hence our description of it as an array of statistical weights. The image acquired by the sensor is a convolution of the real scene with the APSF, and therefore, if the convolution filter is known, it is possible to attempt a deconvolution to remove the adjacency effect. The tools provided in this package allow the APSF to be simulated / reconstructed for a given atmospheric condition, observation geometry, and sensor spatial resolution.
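The blur-then-deconvolve idea can be sketched numerically. The PSF weights below are illustrative, not a real atmosphere: 60% of the signal reaches the detector element directly and 40% is redistributed to the four nearest neighbours. With the PSF known exactly and no noise, division by its Fourier spectrum recovers the scene; real data would require regularization:

```python
import numpy as np

# Hypothetical PSF on the full image grid (circular/periodic indexing):
# 0.6 direct weight, 0.1 to each of the four nearest neighbours.
n = 16
psf = np.zeros((n, n))
psf[0, 0] = 0.6
for di, dj in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
    psf[di % n, dj % n] = 0.1

rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 0.3, (n, n))    # "true" surface reflectance

# Adjacency blurring: circular convolution of the scene with the PSF,
# computed as a pointwise product in the Fourier domain.
H = np.fft.fft2(psf)
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * H))

# Deconvolution: divide by the PSF spectrum. This is safe here because the
# direct term dominates (|H| >= 0.6 - 0.4 = 0.2 everywhere), so H is never
# zero; noisy data would need a regularized (e.g. Wiener) inverse instead.
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) / H))
assert np.allclose(restored, scene)
```

The blurred image differs from the scene (contrast is lost), while the deconvolved image matches it to floating-point precision, which is the idealized version of what adjacency-effect correction attempts.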

The Wikipedia entry on the PSF is a good source of information and has interesting examples.

Previous: Home Next: Simulating APSFs
