Uncovering Secrets with Remote Sensing

Artist's rendition of a satellite – mechanik/123RF Stock Photo

Recent significant discoveries in Cambodia and Jordan have highlighted the potential offered by remote sensing and satellite imagery to help uncover secrets on Earth – a field known as satellite archaeology.

Cambodia
Helicopter-mounted Lidar was used to reveal multiple cities beneath the forest floor near the ancient temples of Angkor Wat in Cambodia. Lidar, which stands for Light Detection and Ranging, is an active optical remote sensing technique that uses a laser scanner to map the Earth's topography, emitting a laser pulse and then receiving the backscattered signal. In Cambodia, a topographic Lidar with a near infrared laser was used by Australian archaeologist Dr Damian Evans to survey beneath the forest vegetation.

The conurbations discovered, surrounding the stone temple Preah Khan Kompong Svay, are believed to be between 900 and 1 400 years old. Analysis of the survey has shown a large number of homes packed together like terraced houses, together with structures for managing water and geometric patterns formed from earth embankments – which could be gardens.

At 734 square miles, the 2015 survey is also thought to be the most extensive of its type ever undertaken. Dr Evans' work is due to be published in the Journal of Archaeological Science.

Jordan
Archaeologists using high resolution satellite imagery, drone surveys and imagery within Google Earth have discovered a huge structure buried in the sand less than a kilometre south of the city of Petra. The two high resolution satellites used were Worldview-1 and Worldview-2, operated by DigitalGlobe. Worldview-1 was launched in September 2007 and has a half-metre panchromatic resolution; Worldview-2, launched two years later, offers similar panchromatic resolution and 1.85m multispectral resolution.

The outline of the structure measures 56m x 49m, and there is a smaller platform contained inside the larger one. Nearby pottery finds suggest the platform is 2 150 years old, and it is thought that it had a ceremonial purpose. The research undertaken by Sarah Parcak and Christopher Tuttle was published in the May 2016 edition of the Bulletin of the American Schools of Oriental Research.

Benefits of Remote Sensing & Satellites
Angkor Wat and Petra are both World Heritage sites, and the benefits of using remote sensing and satellite technology for archaeological investigations are evident in the statement from Christopher Tuttle, who noted that they did not intend to excavate their Petra discovery as 'The moment you uncover something, it starts to disintegrate.'

Satellite technology allows investigations to take place without disturbing a piece of soil or grain of sand, which is a huge benefit in terms of time, cost and preservation with archaeology. These two discoveries also demonstrate that the world still has secrets to reveal. As Sarah Parcak herself said in 2013, “We’ve only discovered a fraction of one percent of archaeological sites all over the world.”

Who knows what remote sensing and satellite imagery will uncover in the future?

How to Measure Heights From Space?

Combining two Sentinel-1A radar scans from 17 and 29 April 2015, this interferogram shows changes on the ground that occurred during the 25 April earthquake that struck Nepal. Contains Copernicus data (2015)/ESA/Norut/PPO.labs/COMET–ESA SEOM INSARAP study

Accurately measuring the height of buildings, mountains or water bodies is possible from space. Active satellite sensors send out pulses of energy towards the Earth and measure the strength and return time of the energy reflected back, enabling them to determine the heights of the objects struck by the pulse.

This measurement of the time it takes an energy pulse to return to the sensor can be used with both optical and microwave data. Optical techniques such as Lidar send out a laser pulse; however, within this blog we're going to focus on techniques using microwave energy, which operate within the Ku, C, S and Ka frequency bands.

Altimetry is a traditional technique for measuring heights. This type of technique is termed Low Resolution Mode, as it sends out a pulse of energy that returns as a wide footprint on the Earth's surface. Therefore, care needs to be taken with variable surfaces, as the energy reflected back to the sensor gives measurements from different surfaces. The signal also needs to be corrected for speed of travel through the atmosphere and small changes in the orbit of the satellite before it can be used to calculate a height to centimetre accuracy. Satellites that use this type of methodology include Jason-2, which operates in the Ku band, and Saral/AltiKa, which operates in the Ka band. Pixalytics has been working on a technique to measure river and flood water heights using this type of satellite data. This would have a wide range of applications in remote area monitoring, early warning systems and disaster relief, and, as shown in the paper 'Challenges for GIS remain around the uncertainty and availability of data' by Tina Thomson, offers potential for the insurance and risk industries.
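The altimetry calculation described above can be sketched in a few lines: convert the two-way travel time of the pulse into a range, apply the atmospheric and orbit corrections, and subtract from the satellite's known orbital altitude. This is a minimal illustration of the principle only; the orbit altitude, travel time and correction values below are made-up numbers, not real Jason-2 measurements.

```python
# Sketch of a radar altimeter height retrieval. All numbers are
# illustrative, not real satellite measurements.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def surface_height(orbit_altitude_m, two_way_time_s, corrections_m=0.0):
    """Surface height above the reference surface, in metres.

    corrections_m bundles the atmospheric delay and orbit adjustments
    mentioned in the text (the value used here is purely illustrative).
    """
    range_m = C * two_way_time_s / 2.0  # one-way distance to the surface
    return orbit_altitude_m - (range_m - corrections_m)

# Illustrative: a ~1 336 km orbit (Jason-2-like) and a made-up travel time
h = surface_height(1_336_000.0, 0.00891, corrections_m=2.4)
```

The centimetre accuracy quoted in the text comes from how precisely the corrections and the orbit are known, not from the raw timing alone.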

A second methodology for measuring heights using microwave data is Interferometric Synthetic Aperture Radar (InSAR), which uses phase measurements from two or more successive satellite SAR images to determine the Earth's shape and topography. It can calculate millimetre-scale changes in height and can be used to monitor natural hazards and subsidence. InSAR is useful with relatively static surfaces, such as buildings, as the successive satellite images can be accurately compared. However, where you have dynamic surfaces, such as water, the technique is much more difficult to use as the surface will have naturally changed between images. Both ESA's Sentinel-1 and CryoSat-2 carry instruments with which this technique can be applied.

The image at the top of the blog is an interferogram using data collected by Sentinel-1 in the aftermath of the recent earthquake in Nepal. The colours on the image reflect the movement of the ground between the before and after images, and initial investigations from scientists indicate that Mount Everest has shrunk by 2.8 cm (1 inch) following the quake, although this needs further research to confirm the height change.
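The relationship between the colour fringes in an interferogram and actual ground motion can be sketched simply: each full 2π cycle of interferometric phase corresponds to half a radar wavelength of movement along the line of sight. The sketch below assumes Sentinel-1's C-band wavelength of roughly 5.55 cm and ignores sign conventions, atmospheric phase delays and the geometry needed to convert line-of-sight motion into vertical height change.

```python
import math

# Sketch of the InSAR phase-to-displacement relation: one colour fringe
# (a 2*pi phase cycle) corresponds to lambda/2 of line-of-sight motion.

WAVELENGTH_M = 0.0555  # approximate Sentinel-1 C-band radar wavelength, metres

def los_displacement(phase_change_rad):
    """Line-of-sight displacement (m) for a given interferometric phase change."""
    return WAVELENGTH_M * phase_change_rad / (4.0 * math.pi)

# One full fringe equals half a wavelength of motion, i.e. roughly 2.8 cm:
fringe_m = los_displacement(2.0 * math.pi)
```

This is why C-band interferograms can resolve such small movements: a single fringe already represents under 3 cm of surface change.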

From the largest mountain to the smallest changes, satellite data can help measure heights across the world.

Lidar: From space to your garage and pocket

Lidar data overlaid on an aerial photo for Pinellas Point, Tampa Bay, USA. Data courtesy of the NASA Experimental Airborne Advanced Research Lidar (EAARL), http://gulfsci.usgs.gov/tampabay/data/1_lidar/index.html

Lidar isn't a word most people use regularly, but recent developments in the field might see a future where it becomes part of everyday life.

Lidar, an acronym for LIght Detection And Ranging, was first developed in the 1960s and is primarily a technique for measuring distance; however, other applications include atmospheric Lidar, which measures clouds, particles and gases such as ozone. The system comprises a laser, a scanner and a GPS receiver, and it works by emitting a laser pulse towards a target and measuring the time it takes for the pulse to return.
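The ranging principle just described reduces to one line of arithmetic: the pulse travels at the speed of light, so the distance to the target is the round-trip time multiplied by that speed and halved. The timing value below is illustrative.

```python
# Minimal sketch of Lidar time-of-flight ranging: emit a pulse, time its
# return, and halve the round trip.

C = 299_792_458.0  # speed of light, m/s

def lidar_distance(round_trip_time_s):
    """Distance to the target in metres from the pulse's round-trip time."""
    return C * round_trip_time_s / 2.0

# A return after roughly 6.67 microseconds puts the target about 1 km away
d = lidar_distance(6.67e-6)
```

The practical challenge is the timing precision: resolving centimetres requires clocking the return to within a fraction of a nanosecond.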

There are two main types of Lidar used within remote sensing for measuring distance, topographic and bathymetric; topographic Lidar uses a near infrared laser to map land, while bathymetric Lidar uses water-penetrating green light to measure the seafloor. The image at the top of the blog is bathymetric Lidar overlaid on an aerial photograph of Pinellas Point, Tampa Bay in the USA, showing depths below sea level in metres. Airborne terrestrial Lidar applications have also been expanded to include measuring forest structures and tree canopy mapping; whilst there are ground-based terrestrial laser scanners for mapping structures such as buildings.

As a user, getting freely accessible airborne Lidar data isn't easy, but there are some places that offer datasets including:

Spaceborne terrestrial Lidar has been limited, as it has to overcome a number of challenges:

  • It's an active remote sensing technique, which means it requires a lot more power to run than passive systems, and for satellites more power means more cost.
  • It's an optical system and, like all optical systems, is affected by cloud cover and poor visibility; interestingly, it works more effectively at night, as the processing doesn't need to account for the sun's reflection.
  • Lidar performance decreases with the inverse square of the distance between the target and the system.
  • Lidar collects individual points rather than an image; images are created by combining lots of individual points. Whilst multiple overflights can be made quickly by a plane, a satellite orbiting the Earth effectively collects lines of points over a number of days, which takes time.
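The inverse-square point in the list above is worth making concrete, since it explains why spaceborne Lidar is so much harder than airborne Lidar. The sketch below compares the returned signal at two illustrative operating distances; the altitudes are round numbers chosen for the example, not figures for any particular platform.

```python
# Sketch of the inverse-square falloff: doubling the distance between the
# Lidar and its target cuts the returned signal to a quarter.

def relative_return(distance_m, reference_distance_m=1.0):
    """Returned signal strength relative to that at the reference distance."""
    return (reference_distance_m / distance_m) ** 2

# Illustrative comparison: an aircraft at ~3 km vs a satellite at ~600 km
# sees (3/600)**2 = 1/40000 of the signal for the same target and power.
ratio = relative_return(600_000.0, reference_distance_m=3_000.0)
```

That factor of tens of thousands is what drives the power (and therefore cost) penalty mentioned in the first bullet.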

The only satellite to have studied the Earth's surface using Lidar is NASA's Ice, Cloud and land Elevation Satellite with its Geoscience Laser Altimeter System (ICESat-GLAS); launched in 2003, it was decommissioned in 2010. It measured ice sheet elevations and changes, together with cloud and aerosol height profiles, land elevation and vegetation cover, and sea ice thickness; its data products are freely available online. Its successor, ICESat-2, is scheduled for launch in 2017. The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission, part of the satellite A-Train, is a joint NASA and CNES mission launched in 2006. Originally designed as an atmosphere-focused Lidar, it has since developed marine applications that led to the SABOR campaign we discussed in a previous blog.

Beyond remote sensing, Lidar may become part of every household in the future, if recent proof-of-concepts come to fruition. The Google self-driving car uses a Lidar as part of its navigation system to generate a 3D map of the surrounding environment. In addition, research recently published in Optics Express by Dr Ali Hajimiri of the California Institute of Technology has described the potential of a tiny Lidar device capable of turning mobile phones into 3D scanning devices. Using a nanophotonic coherent imager, the proof-of-concept device has put together a 3D image of the front of a U.S. penny from half a metre away, with 15-μm depth resolution and 50-μm lateral resolution.

Lidar has many remote sensing and surveying applications; however, in the future we could all have lasers in our garages and pockets.

Smashing the Earth Observation Data Silos

Artist's rendition of a satellite – paulfleet/123RF Stock

Earth observation (EO) is an all-encompassing term for monitoring our world; however, as soon as you start examining the topography of the field in detail you'll find all sorts of mountains, valleys and oceans. An illustration of the different strands can be seen if you consider subject areas such as hydrography, geology, surveying and remote sensing; areas of interest, such as the land and marine specialists; and sensor specialisms, such as Lidar, optical or hyperspectral imaging. Historically, a lot of these groupings have tended to work in relative isolation with a limited amount of interaction between them, which has created a lot of EO data, and knowledge, silos.

However, as satellite technology has developed, the quantity of EO data available has increased exponentially; for example, Landsat is currently collecting fourteen times as many images each day as it was in the 1980s. Whilst many datasets have been collected, few have been brought together. This is due to both the computing power required to manage large datasets and the difficulties of cross-calibrating sensors with different errors and uncertainties. Cloud computing has broken through most of the data processing obstacles, giving the potential for many more people to get involved in data manipulation, modelling and visualisation.

The next challenge is to smash open these data silos and provide access to historical archives, and new collections, to both the scientific community and anyone else who is interested. Joining together the different strands of data and knowledge will promote innovation and help us significantly develop our understanding of the planet. Individual space agencies are working on this by making new data freely available and by analysing their own historical archives and then reprocessing them to improve consistency. Some examples include:

Progress is being made, but there are still limitations, as often this represents only the bringing together of data from a single mission, product set or thematic group. There is a need to be bolder and to amalgamate much wider datasets. Last week, Taiwan demonstrated how this could be achieved by presenting its petascale database for assessing climatic conditions, which has brought together data from the atmosphere, hydrology, ocean currents, tectonics and space. The Earth Science Observation Knowledge base holds ten and a half million records and gives scientists near-real-time access to data. EO has a vast array of valuable data and is collecting more every day. We're starting to smash the data silos, but we need to do more to achieve the next step change in understanding how our world works.

Looking Deeper At Phytoplankton from Space

NASA is currently in the middle of a joint airborne and sea campaign to study the ocean and atmosphere in preparation for developing instruments for future spaceborne missions. The Ship-Aircraft Bio-Optical Research (SABOR) campaign has brought together experts from a variety of disciplines to focus on the issue of the polarization of light in the ocean; it runs from 17th July to 7th August and will co-ordinate ocean measurements with overflights.

One of the instruments on SABOR is an airborne Lidar-Polarimeter aimed at overcoming the limitation of vertically integrated surface measurements as captured by many existing Earth Observation satellites. These traditional satellites measure the water-leaving radiance, which is the signal returned from an area of water; the problem is that the signal is returned from a variety of different depths and it’s then aggregated to provide a single vertically integrated measurement for that area.

Diffuse attenuation depth at 490 nm, Kd(490), created from the SeaWiFS mission climatological data; data products retrieved from http://oceancolor.gsfc.nasa.gov/

In effect, this means that a phytoplankton bloom at the surface will show up as a strong concentration on an image, whereas the same bloom at a greater depth will show as having a lower concentration. The figure on the right shows the diffuse attenuation coefficient at 490 nm, blue light, created from the SeaWiFS mission climatological data collected between 1997 and 2010; the higher the value, the shallower the depth of maximum passive light penetration. So, in summary, the light penetrates further within the open ocean than in many coastal waters, which are more turbid.
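The link between Kd(490) and penetration depth follows from simple exponential attenuation: irradiance decays as exp(-Kd × depth), so a higher Kd means a shallower sensing depth. The sketch below illustrates this; the two Kd values are illustrative of clear open-ocean versus turbid coastal water, not values taken from the SeaWiFS climatology.

```python
import math

# Sketch of how the diffuse attenuation coefficient Kd(490) controls
# light penetration: irradiance decays exponentially with depth.

def fraction_remaining(kd_per_m, depth_m):
    """Fraction of surface irradiance remaining at a given depth."""
    return math.exp(-kd_per_m * depth_m)

def first_attenuation_depth(kd_per_m):
    """Depth (m) at which irradiance falls to 1/e of its surface value."""
    return 1.0 / kd_per_m

# Illustrative Kd values: clear open ocean vs turbid coastal water
open_ocean = first_attenuation_depth(0.03)  # tens of metres
coastal = first_attenuation_depth(0.5)      # a couple of metres
```

This is why a deep bloom appears weak in a vertically integrated satellite product: most of the signal reaching the sensor comes from the upper attenuation depth or so of the water column.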

The SABOR Lidar is based on lasers and will provide depth-resolved profiles, so instead of having a single value for an area of water, the measurements will be separable for different depths; expected to penetrate to around 50m. This will enable a much more detailed analysis of what’s happening within the water column. Satellite Lidar measurements have already been used to provide initial insights into the scattering of light resulting from phytoplankton through the CALIPSO satellite, an atmospheric focused Lidar mission launched in 2006.

In addition, the polarimeter element of SABOR will improve the quantification of the in-water constituents, such as the concentration of Chlorophyll-a (the primary pigment in most phytoplankton as well as land based plants) plus an understanding of the marine aerosols and clouds. Polarimeters have been launched before with the POLDER/PARASOL missions being examples.

The SABOR campaign will provide valuable information to support a proposal to have an Ocean Profiling Atmospheric Lidar (OPAL) deployed from the International Space Station (ISS) in 2015. If successful, it will join the existing Earth Observation mission on the ISS, called the Hyperspectral Imager for the Coastal Ocean (HICO), which I discussed in an earlier blog.

The potential offered by depth profiled oceanic measurements is exciting and will offer much more granularity beyond the ocean’s surface. I’m looking forward to the campaign’s results.