Optical Imagery is Eclipsed!

Solar eclipse across the USA, captured by the VIIRS instrument aboard the Suomi NPP satellite on 21st August 2017. Image courtesy of NASA / NASA Earth Observatory.

Last week’s eclipse gave an excellent demonstration of the sun’s role in optical remote sensing. The image above was acquired on 21st August by the Visible Infrared Imaging Radiometer Suite (VIIRS) aboard the NOAA/NASA Suomi NPP satellite, and the moon’s shadow can be clearly seen in the centre of the image.

Optical remote sensing images are the type most familiar to people, as they use the visible spectrum and essentially show the world in a similar way to how the human eye sees it. The system works by a sensor aboard the satellite detecting sunlight reflected off the land or water; the fraction of incoming sunlight an object scatters back towards the sensor is known as its reflectance.
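To make the idea of reflectance a little more concrete, the sketch below shows the standard top-of-atmosphere reflectance formula, which converts the radiance a sensor measures into the fraction of incoming sunlight reflected back. The function name and the numbers in the example are illustrative rather than taken from any particular VIIRS product.

```python
import math

def toa_reflectance(radiance, esun, solar_zenith_deg, earth_sun_dist=1.0):
    """Top-of-atmosphere reflectance = (pi * L * d^2) / (ESUN * cos(theta_s)).

    radiance         : at-sensor spectral radiance (W m-2 sr-1 um-1)
    esun             : mean exoatmospheric solar irradiance for the band (W m-2 um-1)
    solar_zenith_deg : solar zenith angle at the pixel, in degrees
    earth_sun_dist   : Earth-Sun distance in astronomical units
    """
    return (math.pi * radiance * earth_sun_dist ** 2) / (
        esun * math.cos(math.radians(solar_zenith_deg))
    )

# Illustrative numbers only: a bright pixel under a high sun gives a
# reflectance somewhere in the 0-1 range, while zero radiance (no sunlight,
# as inside the eclipse shadow) gives zero reflectance.
print(toa_reflectance(radiance=80.0, esun=1550.0, solar_zenith_deg=30.0))
```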

Optical instruments collect data across a variety of spectral wavebands, including those beyond human vision. However, the most common form of optical image is what is known as a pseudo true-colour composite, which combines the red, green and blue wavebands to produce an image that effectively matches human vision; i.e., in these images vegetation tends to be green, water blue and buildings grey. These are also referred to as RGB images.
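As a rough illustration of how an RGB composite is assembled, the snippet below stacks three hypothetical reflectance bands into a displayable image. The array names are placeholders, not the actual band identifiers used by VIIRS or MODIS.

```python
import numpy as np

def true_colour_composite(red, green, blue):
    """Stack red, green and blue reflectance bands (2-D arrays scaled 0-1)
    into an H x W x 3 array that image libraries can display directly."""
    rgb = np.dstack([red, green, blue])
    # Clip to the valid display range in case of noise or sun glint.
    return np.clip(rgb, 0.0, 1.0)

# Tiny synthetic example: a 2 x 2 scene with 'vegetation' (green-dominant)
# and 'water' (blue-dominant) pixels.
red   = np.array([[0.05, 0.04], [0.10, 0.08]])
green = np.array([[0.30, 0.28], [0.12, 0.10]])
blue  = np.array([[0.04, 0.05], [0.25, 0.30]])
rgb = true_colour_composite(red, green, blue)
print(rgb.shape)  # (2, 2, 3)
```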

These images are often enhanced by adjusting the colour palettes of the individual wavebands so that the colours stand out more: the vegetation appears greener and the ocean bluer than in the original data captured by the satellite. The VIIRS image above is an enhanced pseudo true-colour composite, and the difference between the land and the ocean is clearly visible, as are the white clouds.
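The sort of enhancement described above is often done with a simple per-band contrast stretch before the bands are recombined. The percentile cut-offs below are a common illustrative choice, not the actual processing applied to the VIIRS image.

```python
import numpy as np

def stretch_band(band, low_pct=2, high_pct=98):
    """Linearly rescale one band so the chosen percentiles map to 0 and 1,
    making the colours stand out more when the bands are recombined."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

# Applying the same stretch to each of the red, green and blue bands before
# stacking them gives the 'enhanced' composite: greener vegetation, bluer ocean.
```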

As we noted above, optical remote sensing works by detecting the sunlight reflected from the land and water. During the eclipse the moon’s shadow meant no sunlight reached the Earth beneath it, causing the circle of no reflectance (black) over the centre of the USA. This is also the reason why no optical imagery is produced at night.
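One simple way to see this effect in the data is to flag pixels that received essentially no sunlight, whether on the night side or inside the eclipse shadow. The threshold below is an illustrative guess, not an operational VIIRS or MODIS night/shadow mask.

```python
import numpy as np

def unlit_mask(rgb, threshold=0.02):
    """Return True where the mean reflectance across the three visible
    bands of an H x W x 3 composite is below the threshold,
    i.e. where there is essentially no reflected sunlight."""
    return rgb.mean(axis=2) < threshold
```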

This also explains why the nemesis of optical imagery is clouds! In cloudy conditions, the sunlight is reflected back to the sensor by the clouds and does not reach the land or water. In this case the satellite images simply show swirls of white!
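A crude cloud screen follows the same logic in reverse: cloud pixels are bright in every visible band. Operational products such as the MODIS cloud mask use many more tests; the single threshold below is purely illustrative.

```python
import numpy as np

def crude_cloud_mask(rgb, threshold=0.4):
    """Flag pixels of an H x W x 3 composite whose reflectance is high in
    all three visible bands, which is typical of thick cloud rather than
    land or open water."""
    return rgb.min(axis=2) > threshold
```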

Mosaic composite image of the solar eclipse over the USA on 21st August 2017, acquired by MODIS. Image courtesy of NASA Earth Observatory; images by Joshua Stevens and Jesse Allen, using MODIS data from the Land Atmosphere Near real-time Capability for EOS (LANCE) and EOSDIS/Rapid Response.

A second eclipse image was produced from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra satellite. Shown above, this is a mosaic of imagery from 21st August, where:

  • The right third of the image shows the eastern United States at about 12:10 p.m. Eastern Time, before the eclipse had begun.
  • The middle part was captured at about 12:50 p.m. Central Time during the eclipse.
  • The left third of the image was collected at about 12:30 p.m. Pacific Time, after the eclipse had ended.

Again, the moon’s shadow is obvious from the black area on the image.

Hopefully, this gives you a bit of an insight into how optical imagery works and why you can’t get optical images at night, under cloudy conditions or during an eclipse!

4 thoughts on “Optical Imagery is Eclipsed!”

  1. There must be a substantial number of images that have been affected by partial eclipse (totality is at least fairly obvious), but this might not be realised by users.

    Landsat acquisitions on the day of the eclipse are listed here: https://landsat.usgs.gov/august-18-2017-solar-eclipse-and-landsat. None were during totality, but one was only 23 minutes away. These acquisitions would have a partial eclipse depth that varies across the image, which would play havoc with photometric calibration if you didn’t realise.
