Optical Imagery is Eclipsed!

Solar eclipse across the USA captured by the Suomi NPP VIIRS satellite on 21st August. Image courtesy of NASA/NASA Earth Observatory.

Last week’s eclipse gave an excellent demonstration of the sun’s role in optical remote sensing. The image to the left was acquired on the 21st August by the Visible Infrared Imaging Radiometer Suite (VIIRS) aboard the NOAA/NASA Suomi NPP satellite, and the moon’s shadow can be clearly seen in the centre of the image.

Optical remote sensing images are the type most familiar to people, as they use the visible spectrum and essentially show the world in a similar way to how the human eye sees it. The system works by a sensor aboard the satellite detecting sunlight reflected off the land or water – the proportion of incident light an object scatters back towards the sensor is known as its reflectance.

Optical instruments collect data across a variety of spectral wavebands including those beyond human vision. However, the most common form of optical image is what is known as a pseudo true-colour composite which combines the red, green and blue wavelengths to produce an image which effectively matches human vision; i.e., in these images vegetation tends to be green, water blue and buildings grey. These are also referred to as RGB images.

These images are often enhanced by adjusting the colour palettes of each of the individual wavelengths so that the colours stand out more – the vegetation is greener and the ocean bluer than in the original data captured by the satellite. The VIIRS image above is an enhanced pseudo true-colour composite, and the difference between the land and the ocean is clearly visible, as are the white clouds.
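In code, building such a composite amounts to stacking the three wavebands and stretching each one. The sketch below is illustrative only – the band arrays and the 2–98% stretch limits are my assumptions, not taken from any particular product:

```python
import numpy as np

def true_colour_composite(red, green, blue, low=2, high=98):
    """Stack red, green and blue wavebands into an RGB image,
    stretching each band between its low/high percentiles so that,
    for example, vegetation looks greener and the ocean bluer."""
    stretched_bands = []
    for band in (red, green, blue):
        lo, hi = np.percentile(band, [low, high])
        stretched = np.clip((band - lo) / (hi - lo), 0.0, 1.0)
        stretched_bands.append(stretched)
    # Shape (rows, cols, 3), values in [0, 1], ready for display
    return np.dstack(stretched_bands)
```

The percentile stretch is what makes an "enhanced" composite: clipping the extreme dark and bright values spreads the remaining range over the full display scale.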

As we noted above, optical remote sensing works by taking the sunlight reflected from the land and water. Therefore during the eclipse the moon’s shadow means no sunlight reaches the Earth beneath, causing the circle of no reflectance (black) in the centre of the USA. This is also the reason why no optical imagery is produced at night.

This also explains why the nemesis of optical imagery is clouds! In cloudy conditions, the sunlight is reflected back to the sensor by the clouds and does not reach the land or water. In this case the satellite images simply show swirls of white!

Mosaic composite image of solar eclipse over the USA on the 21st August 2017 acquired by MODIS. Image courtesy of NASA Earth Observatory; images by Joshua Stevens and Jesse Allen, using MODIS data from the Land Atmosphere Near real-time Capability for EOS (LANCE) and EOSDIS/Rapid Response.

A second eclipse image was produced from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra satellite. Shown on the left this is a mosaic image from the 21st August, where:

  • The right third of the image shows the eastern United States at about 12:10 p.m. Eastern Time, before the eclipse had begun.
  • The middle part was captured at about 12:50 p.m. Central Time during the eclipse.
  • The left third of the image was collected at about 12:30 p.m. Pacific Time, after the eclipse had ended.

Again, the moon’s shadow is obvious from the black area on the image.

Hopefully, this gives you a bit of an insight into how optical imagery works and why you can’t get optical images at night, under cloudy conditions or during an eclipse!

Algae Starting To Bloom

Algal Blooms in Lake Erie, around Monroe, acquired by Sentinel-2 on 3rd August 2017. Data Courtesy of ESA/Copernicus.

Algae have been making the headlines in the last few weeks, which is definitely a rarely used phrase!

Firstly, the Lake Erie freshwater algal bloom has begun at the western end of the lake near Toledo. This is becoming an almost annual event; in 2014 it interrupted the water supply for a few days for around 400,000 residents in the local area.

An algal bloom refers to a high concentration of micro algae, known as phytoplankton, in a body of water. Blooms can grow quickly in nutrient rich waters and potentially have toxic effects. Although a lot of algae is harmless, the toxic varieties can cause rashes, nausea or skin irritation if you were to swim in them; they can also contaminate drinking water, and can enter the food chain through shellfish as they filter large quantities of water.

Lake Erie is the fourth largest of the Great Lakes on the US/Canadian border by surface area, measuring around 25,700 square km, although it’s also the shallowest and, at 484 cubic km, has the smallest water volume. Due to its southern position it is the warmest of the Great Lakes, which may be a factor in the creation of nutrient rich waters. The National Oceanic and Atmospheric Administration (NOAA) produces both an annual forecast and, during the bloom season which lasts until late September, a twice weekly Harmful Algal Bloom Bulletin. The forecast reflects the expected biomass of the bloom, but not its toxicity; this year’s forecast was 7.5 on a scale of 1 to 10, while the largest recent blooms, in 2011 and 2015, both hit the top of the scale. Interestingly, this year NOAA will start incorporating Sentinel-3 data into the programme.

Western end of Lake Erie acquired by Sentinel-2 on 3rd August 2017. Data courtesy of ESA/Copernicus.

Despite the phytoplankton within algal blooms being only a thousandth of a millimetre in size, their sheer numbers enable them to be seen from space. The image to the left is a Sentinel-2 image of the western side of the lake, acquired on the 3rd August, where you can see the green swirls of the algal bloom; there are also some interesting aircraft contrails visible. The image at the top of the blog zooms in on the city of Monroe, where the Detroit River flows into the lake and the algal bloom is more prominent.

Landsat 8 acquired this image of the northwest coast of Norway on the 23rd July 2017. Image courtesy of NASA/NASA Earth Observatory.

It’s not just Lake Erie where algal blooms have been spotted recently:

  • Chautauqua Lake and Findley Lake, which are both just south of Lake Erie, have reported algal blooms this month.
  • NASA’s Landsat 8 satellite captured the image on the right, a bloom off the northwest coast of Norway on the 23rd July. It is noted that blooms at this latitude are in part due to the sunlight of long summer days.
  • The MODIS instrument onboard NASA’s Aqua satellite acquired the stunning image below of the Caspian Sea on the 3rd August.

Image of the Caspian Sea, acquired on 3rd August 2017, by MODIS on NASA’s Aqua satellite. Image Courtesy of NASA/NASA Earth Observatory.

Finally as reported by the BBC, an article in Nature this week proposes that it was a takeover by ocean algae 650 million years ago which essentially kick started life on Earth as we know it.

So remember, they may be small, but algae can pack a punch!

If no-one is there when an iceberg is born, does anyone see it?

Larsen C ice Shelf including A68 iceberg. Image acquired by MODIS Aqua satellite on 12th July 2017. Image courtesy of NASA.

The titular paraphrasing of the famous falling tree in the forest riddle was well and truly answered this week, and shows just how far satellite remote sensing has come in recent years.

Sometime between Monday 10th and Wednesday 12th July 2017, a huge iceberg was created by splitting off the Larsen C Ice Shelf in Antarctica. It is one of the biggest icebergs ever recorded according to scientists from Project MIDAS, a UK-based Antarctic research project, who estimate its area to be 5,800 sq km and its weight to be more than a trillion tonnes. It has reduced the Larsen C Ice Shelf by more than twelve percent.

The iceberg has been named A68, which is a pretty boring name for such a huge iceberg. However, icebergs are named by the US National Ice Center: the letter comes from the quadrant where the iceberg was originally sighted – in this case the A represents the area from zero to ninety degrees west, covering the Bellingshausen and Weddell Seas. The number is simply the order in which they are discovered, which I assume means there have been 67 previous icebergs!
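Put as code, the naming rule is simple. The sketch below is my reading of the convention; the exact quadrant boundaries and sea labels are assumptions based on public descriptions of the scheme:

```python
def iceberg_name(longitude, sequence_number):
    """Sketch of the US National Ice Center naming convention:
    a letter for the quadrant where the iceberg was first sighted
    (longitude in degrees, east positive), then the order of
    discovery within that quadrant."""
    if -90 <= longitude < 0:
        quadrant = 'A'  # 0 to 90W: Bellingshausen / Weddell Seas
    elif longitude < -90:
        quadrant = 'B'  # 90W to 180: Amundsen / Eastern Ross Seas
    elif longitude >= 90:
        quadrant = 'C'  # 180 to 90E: Western Ross Sea / Wilkes Land
    else:
        quadrant = 'D'  # 90E to 0: Amery / Eastern Weddell Sea
    return f'{quadrant}{sequence_number}'
```

Larsen C sits at roughly 60 degrees west, so a berg sighted there falls in quadrant A, and the 68th such berg becomes A68.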

After satisfying my curiosity about the iceberg names, the other element that caught our interest was the host of Earth observation satellites that captured images of either the calving itself or the newly birthed iceberg. The ones we’ve spotted so far, although there may be others, are:

  • ESA’s Sentinel-1 has been monitoring the area for the last year as an iceberg splitting from Larsen C was expected. Sentinel-1’s SAR imagery has been crucial to this monitoring as the winter clouds and polar darkness would have made optical imagery difficult to regularly collect.
  • Whilst Sentinel-1 was monitoring the area, it was actually NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) instrument onboard the Aqua satellite which confirmed the ‘birth’ on the 12th July with a false colour image at 1 km spatial resolution using band 31 which measures infrared signals. This image is at the top of the blog and the dark blue shows where the surface is warmest and lighter blue indicates a cooler surface. The new iceberg can be seen in the centre of the image.
  • Longwave infrared imagery was also captured by the NOAA/NASA Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite on July 13th.
  • Similarly, NASA also reported that Landsat 8 captured a false-colour image from its Thermal Infrared Sensor on the 12th July showing the relative warmth or coolness of the Larsen C ice shelf – with the area around the new iceberg being the warmest giving an indication of the energy involved in its creation.
  • Finally, Sentinel-3A has also got in on the thermal infrared measurement using the bands of its Sea and Land Surface Temperature Radiometer (SLSTR).
  • ESA’s CryoSat has been used to calculate the size of the iceberg using its Synthetic Aperture Interferometric Radar Altimeter (SIRAL), which measured the height of the iceberg out of the water. Using this data, it has been estimated that the iceberg contains around 1,155 cubic km of ice.
  • The only optical imagery we’ve seen so far is from the Deimos-1 satellite, which is owned by Deimos Imaging, an UrtheCast company. This image is from the 14th July and revealed that the giant iceberg was already breaking up into smaller pieces.

It’s clear this is a huge iceberg; so huge, in fact, that most news agencies don’t think readers can comprehend its vastness without help, and so they offer a comparison. Some of the ones I came across were:

  • Size of the US State of Delaware
  • Twice the size of Luxembourg
  • Four times the size of greater London
  • Quarter of the size of Wales – UK people will know that Wales is almost an unofficial unit of size measurement in this country!
  • Has the volume of Lake Michigan
  • Has twice the volume of Lake Erie
  • Has the volume of 463 million Olympic-sized swimming pools; and
  • My favourite compares its size to the A68 road in the UK, which runs from Darlington to Edinburgh.
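The arithmetic behind a couple of these comparisons is easy to reproduce. The figures below are the commonly quoted approximations (a nominal Olympic pool of 50 m x 25 m x 2 m, and the CryoSat-derived volume of roughly 1,155 cubic km):

```python
# Quoted figures: iceberg volume ~1,155 cubic km (CryoSat estimate),
# Lake Erie volume 484 cubic km, nominal Olympic pool 50 m x 25 m x 2 m.
ICEBERG_VOLUME_KM3 = 1155
LAKE_ERIE_VOLUME_KM3 = 484
OLYMPIC_POOL_M3 = 50 * 25 * 2  # 2,500 cubic metres

pools = ICEBERG_VOLUME_KM3 * 1e9 / OLYMPIC_POOL_M3  # 1 km3 = 1e9 m3
erie_multiples = ICEBERG_VOLUME_KM3 / LAKE_ERIE_VOLUME_KM3

print(round(pools / 1e6))        # ~462 million pools, matching the quoted figure
print(round(erie_multiples, 1))  # ~2.4, i.e. roughly twice Lake Erie
```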

This event shows how satellites are monitoring the planet, and the different ways we can see the world changing.

Locusts & Monkeys

Soil moisture data from the SMOS satellite and the MODIS instrument acquired between July and October 2016 were used by isardSAT and CIRAD to create this map showing areas with favourable locust swarming conditions (in red) during the November 2016 outbreak. Data courtesy of ESA. Copyright : CIRAD, SMELLS consortium.

Spatial resolution is a key characteristic in remote sensing, as we’ve previously discussed. Often the view is that you need an object to be significantly larger than the resolution to be able to see it on an image. However, this is not always the case as often satellites can identify indicators of objects that are much smaller.

We’ve previously written about satellites identifying phytoplankton in algal blooms, and recently two interesting reports have described how satellites are being used to determine the presence of locusts and monkeys!

Locusts

Desert locusts are a type of grasshopper and, whilst individually they are harmless, as a swarm they can cause huge damage to the populations in their path. Between 2003 and 2005 a swarm in West Africa affected eight million people, with reported losses of 100% for cereals, 90% for legumes and 85% for pasture.

Swarms occur when certain conditions are present; namely a drought, followed by rain and vegetation growth. ESA and the UN Food and Agriculture Organization (FAO) have been working together to determine if data from the Soil Moisture and Ocean Salinity (SMOS) satellite can be used to forecast these conditions. SMOS carries the Microwave Imaging Radiometer with Aperture Synthesis (MIRAS) instrument – a 2D interferometric L-band radiometer with 69 antenna receivers distributed on a Y-shaped deployable antenna array. It observes the ‘brightness temperature’ of the Earth, which indicates the radiation emitted from the planet’s surface. It has a temporal resolution of three days and a spatial resolution of around 50 km.

By combining the SMOS soil moisture observations with data from NASA’s MODIS instrument, the team were able to downscale SMOS to 1 km spatial resolution and use this data to create maps. This approach predicted favourable locust swarming conditions approximately 70 days ahead of the November 2016 outbreak in Mauritania, giving the potential for an early warning system.
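The downscaling idea can be caricatured in a few lines: keep the coarse SMOS observation as the quantitative anchor, and use the fine-resolution (e.g. MODIS-derived) information only to decide how moisture is distributed within each coarse cell. This is a generic sketch of that principle, not the SMELLS consortium’s actual algorithm:

```python
import numpy as np

def downscale_cell(coarse_value, fine_weights):
    """Spread one coarse soil-moisture observation across a fine grid.

    fine_weights: positive array (e.g. derived from 1 km MODIS
    vegetation/temperature data) indicating where moisture is likely
    higher or lower within the coarse cell. The result is scaled so
    its mean equals the coarse observation, i.e. the coarse sensor
    remains the quantitative anchor."""
    w = np.asarray(fine_weights, dtype=float)
    w = w / w.mean()          # normalise weights to a mean of 1
    return coarse_value * w   # fine field whose mean == coarse_value
```

The key property is that aggregating the 1 km field back to 50 km reproduces the original SMOS value, so the downscaling adds spatial detail without inventing new totals.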

This is interesting for us as we’re currently using soil moisture data in a project to provide an early warning system for droughts and floods.

Monkeys

Earlier this month the paper, ‘Connecting Earth Observation to High-Throughput Biodiversity Data’, was published in the journal Nature Ecology and Evolution. It describes the work of scientists from the Universities of Leicester and East Anglia who have used satellite data to help identify monkey populations that have declined through hunting.

The team have used a variety of technologies and techniques to pull together indicators of monkey distribution, including:

  • Earth observation data to map roads and human settlements.
  • Automated recordings of animal sounds to determine what species are in the area.
  • Analysis of caught mosquitoes to determine what they have been feeding on.

Combining these various datasets provides a huge amount of information, and can be used to identify areas where monkey populations are vulnerable.

These projects demonstrate an interesting capability of satellites, which is not always recognised and understood. By using satellites to monitor certain aspects of the planet, the data can be used to infer things happening on a much smaller scale than individual pixels.

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS Image from 25 April 2017, of the Yucatán Peninsula showing where thermal bands have picked-up increased temperatures. Data Courtesy of NASA, NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways of how satellites can monitor fires.

Acquired on the 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico. The image to the right is a natural colour image, and each red dot represents a point where the instrument’s thermal band detected temperatures higher than normal.
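A toy version of that thermal-band detection: compare each pixel’s brightness temperature against the scene background and flag strong positive outliers. Real active-fire products apply several per-pixel contextual tests over local windows, so treat this global-threshold version as purely illustrative:

```python
import numpy as np

def thermal_anomalies(brightness_temp, k=3.0):
    """Flag pixels whose brightness temperature (in kelvin) stands
    out from the scene background by more than k standard deviations.
    A much-simplified, whole-scene stand-in for the contextual tests
    that operational fire products apply around each pixel."""
    bt = np.asarray(brightness_temp, dtype=float)
    background = np.median(bt)   # robust estimate of 'normal' temperature
    spread = bt.std()
    return bt > background + k * spread
```

On a mostly uniform scene, a single burning pixel tens of kelvin above its surroundings is flagged while the background is not.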

False colour image of the West Mims fire on Florida/Georgia boundary acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border, imaged by NASA’s Aqua satellite on the 02 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). On the natural colour image the fires could only be seen as smoke plumes, but the false colour image on the left combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself is shown as orange.

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; the MODIS fire products appear to supply the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for undertaking this practice.

Imaged by Sentinel-2, burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest burnt last year in the Republic of Congo – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with the Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, although the temporal resolution will improve to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36 000 hectares of forest were burnt in 2016.

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses which should potentially help save lives. This is another example of the societal benefit of satellite remote sensing.

Earth observation satellites in space in 2016

Blue Marble image of the Earth taken by the crew of Apollo 17 on Dec. 7 1972. Image Credit: NASA

Earth Observation (EO) satellites account for just over one quarter of all the operational satellites currently orbiting the Earth. As noted last week there are 1 419 operational satellites, and 374 of these have a main purpose of either EO or Earth Science.

What do Earth observation satellites do?
According to the information within the Union of Concerned Scientists database, the main purposes of the current operational EO satellites are:

  • Optical imaging for 165 satellites
  • Radar imaging for 34 satellites
  • Infrared imaging for 7 satellites
  • Meteorology for 37 satellites
  • Earth Science for 53 satellites
  • Electronic Intelligence for 47 satellites
  • 6 satellites with other purposes; and
  • 25 satellites simply list EO as their purpose

Who Controls Earth observation satellites?
There are 34 countries listed as being the main controllers of EO satellites, although there are also a number of joint and multinational satellites – such as those controlled by the European Space Agency (ESA). The USA is the leading country, solely controlling one third of all EO satellites – plus they are joint controllers of others. Of course, the data from some of these satellites are widely shared across the world, such as the Landsat, MODIS and SMAP (Soil Moisture Active Passive) missions.

The USA is followed by China with about 20%, and Japan and Russia come next with around 5% each. The UK is only listed as the controller of 4 satellites, all related to the DMC constellation, although we are also involved in the ESA satellites.

Who uses the EO satellites?
Of the 374 operational EO satellites, the main users are:

  • Government users with 164 satellites (44%)
  • Military users with 112 satellites (30%)
  • Commercial users with 80 satellites (21%)
  • Civil users with 18 satellites (5%)

It should be noted that some of these satellites do have multiple users.
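Those percentages follow directly from the counts in the database; a quick check of the rounding:

```python
# Main-user counts for the 374 operational EO satellites,
# as quoted above from the Union of Concerned Scientists database.
users = {'Government': 164, 'Military': 112, 'Commercial': 80, 'Civil': 18}

total = sum(users.values())
shares = {k: round(100 * v / total) for k, v in users.items()}

print(total)   # 374
print(shares)  # {'Government': 44, 'Military': 30, 'Commercial': 21, 'Civil': 5}
```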

Height and Orbits of Earth observation satellites
In terms of operational EO satellite altitudes:

  • 88% are in a Low Earth Orbit, which generally refers to altitudes of between 160 and 2 000 kilometres (99 and 1 200 miles)
  • 10% are in a geostationary circular orbit at around 35 800 kilometres (22 200 miles)
  • The remaining 2% are described as having an elliptical orbit.

In terms of the types of orbits:

  • 218 are in a sun-synchronous orbit
  • 84 in non-polar inclined orbit
  • 16 in a polar orbit
  • 17 in other orbits including elliptical, equatorial and Molniya orbits; and finally
  • 39 do not have an orbit recorded.

What next?

Our first blog of 2016 noted that this was going to be an exciting year for EO, and it is proving to be the case. We’ve already seen the launches of Sentinel-1B, Sentinel-3A, Jason-3, GaoFen-3 carrying a SAR instrument, and further CubeSats as part of Planet’s Flock imaging constellation.

The rest of the year looks equally exciting with planned launches for Sentinel-2B, Japan’s Himawari-9, India’s INSAT-3DR, DigitalGlobe’s WorldView-4 and NOAA’s Geostationary Operational Environmental Satellite R-Series Program (GOES-R). We can’t wait to see all of this data in action!

Monitoring ocean acidification from space

Enhanced pseudo-true colour composite of the United Kingdom showing coccolithophore blooms in light blue. Image acquired by MODIS-Aqua on 24th May 2016. Data courtesy of NASA.

What is ocean acidification?
Since the industrial revolution the oceans have absorbed approximately 50% of the CO2 produced by human activities (The Royal Society, 2005). Scientists previously saw this oceanic absorption as advantageous; however, ocean observations in recent decades have shown it has caused a profound change in ocean chemistry – resulting in ocean acidification (OA). As CO2 dissolves into the oceans it forms carbonic acid, lowering the pH and moving the oceans into a more acidic state. According to the National Oceanic and Atmospheric Administration (NOAA), ocean acidity has already increased by about 30%, and some studies suggest that, if no changes are made, acidity will have increased by 150% by 2100.

Impacts of OA
It’s anticipated OA will impact many marine species. For example, it’s expected it will have a harmful effect on some calcifying species such as corals, oysters, crustaceans, and calcareous plankton e.g. coccolithophores.

OA can significantly reduce the ability of reef-building corals to produce their skeletons, and can cause the dissolution of oysters’ and crustaceans’ protective shells, making them more susceptible to predation and death. This in turn would affect the entire food web and the wider environment, and would have many socio-economic impacts.

Calcifying phytoplankton, such as coccolithophores, are thought to be especially vulnerable to OA. They are the most abundant type of calcifying phytoplankton in the ocean, are important for the global biogeochemical cycling of carbon, and form the base of many marine food webs. It’s projected that OA may disrupt the formation and/or cause the dissolution of coccolithophores’ calcium carbonate (CaCO3) shells, impacting future populations. Thus, changes in their abundance due to OA could have far-reaching effects.

Unlike other phytoplankton, coccolithophores are highly effective light scatterers relative to their surroundings, due to their production of highly reflective calcium carbonate plates. This allows them to be easily seen in satellite imagery. The figure at the top of this page shows multiple coccolithophore blooms, in light blue, off the coast of the United Kingdom on 24th May 2016.

Current OA monitoring methods
Presently, the monitoring of OA and its effects is predominantly carried out by in situ observations from ships and moorings, using buoys and wave gliders for example. Although vital, in situ data is notoriously sparse spatially, as it is difficult to take measurements in certain areas of the world, especially hostile regions (e.g. the polar oceans). On its own, in situ data does not provide a comprehensive and cost-effective way to monitor OA globally. Consequently, this has driven the development of satellite-based sensors.

How can OA be monitored from space?
Although it is difficult to directly monitor changes in ocean pH using remote sensing, satellites can measure sea surface temperature and salinity (SST & SSS) and surface chlorophyll-a, from which ocean pH can be estimated using empirical relationships derived from in situ data. Although surface measurements may not be representative of deeper biological processes, surface observations are important for OA because the change in pH occurs at the surface first.

In 2015, researchers at the University of Exeter, UK, became the first scientists to use remote sensing to develop a worldwide map of the ocean’s acidity, using satellite imagery from the European Space Agency’s Soil Moisture and Ocean Salinity (SMOS) satellite, launched in 2009, and NASA’s Aquarius satellite, launched in 2011. Thermal infrared sensors measure the SST, while microwave sensors measure the SSS; there are also microwave SST sensors, but they have a coarse spatial resolution.
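A minimal sketch of such an empirical relationship is a linear model in SSS and SST. The coefficients below are invented purely for illustration – real studies fit region-specific relationships (often with chlorophyll-a as a third term) against in situ carbonate chemistry measurements:

```python
def estimate_ph(sst_celsius, sss_psu, coeffs=(8.6, -0.01, -0.004)):
    """Toy empirical model: pH ~ a + b*SSS + c*SST.

    The coefficients are hypothetical placeholders chosen only to
    give values near typical open-ocean surface pH (~8.1); they are
    NOT from any published study."""
    a, b, c = coeffs
    return a + b * sss_psu + c * sst_celsius
```

The point is the structure, not the numbers: once a relationship like this has been calibrated against ship and mooring data, satellite SST and SSS fields can be turned into global surface pH maps.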

Future Opportunities – The Copernicus Program
The European Union’s Copernicus Programme is in the process of launching a series of satellites, known as Sentinel satellites, which will improve understanding of large scale global dynamics and climate change. Of all the Sentinel satellite types, Sentinels 2 and 3 are the most appropriate for assessment of the marine carbonate system. The Sentinel-3 satellite was launched in February this year and will focus mainly on ocean measurements, including SST, ocean colour and chlorophyll-a.

Overall, OA is a relatively new field of research, with most of the studies being conducted over the last decade. It’s certain that remote sensing will have an exciting and important role to play in the future monitoring of this issue and its effects on the marine environment.

Blog written by Charlie Leaman, BSc, University of Bath during work placement at Pixalytics.

The cost of ‘free data’

False Colour Composite of the Black Rock Desert, Nevada, USA. Image acquired on 6th April 2016. Data courtesy of NASA/JPL-Caltech, from the Aster Volcano Archive (AVA).

Last week, the US and Japan announced free public access to the archive of nearly 3 million images taken by the ASTER instrument; previously this data had only been accessible for a nominal fee.

ASTER, Advanced Spaceborne Thermal Emission and Reflection Radiometer, is a joint Japan-US instrument aboard NASA’s Terra satellite with the data used to create detailed maps of land surface temperature, reflectance, and elevation. When NASA made the Landsat archive freely available in 2008, an explosion in usage occurred. Will the same happen to ASTER?

As a remote sensing advocate I want many more people to be using satellite data, and I support any initiative that contributes to this goal. Public satellite data archives such as Landsat, are often referred to as ‘free data’. This phrase is unhelpful, and I prefer the term ‘free to access’. This is because ‘free data’ isn’t free, as someone has already paid to get the satellites into orbit, download the data from the instruments and then provide the websites for making this data available. So, who has paid for it? To be honest, it’s you and me!

To be accurate, these missions are generally funded by the taxpayers of the country that put the satellite up. For example:

  • ASTER was funded by the American and Japanese public
  • Landsat is funded by the American public
  • The Sentinel satellites, under the Copernicus missions, are funded by the European public.

In addition to making basic data available, missions often also create a series of products derived from the raw data. This is achieved either by commercial companies being paid grants to create these products, which can then be offered as free to access datasets, or alternatively the companies develop the products themselves and then charge users to access to them.

‘Free data’ also creates user expectations, which may be unrealistic. Whenever a potential client comes to us, there is always a discussion about which data source to use. Pixalytics is a data independent company, and we suggest the best data to suit the client’s needs; however, this isn’t always a free to access dataset! There are a number of physical and operating criteria that need to be considered:

  • Spectral wavebands / frequency bands: wavelengths for optical instruments and frequencies for radar instruments, which determine what can be detected.
  • Spatial resolution: the size of the smallest objects that can be ‘seen’.
  • Revisit time: how often you are likely to get a new image – important if you’re interested in several acquisitions close together.
  • Long term archives of data: very useful if you want to look back in time.
  • Availability: for example, delivery schedule and ordering requirements.

We don’t want any client to pay for something they don’t need, but sometimes commercial data is the best solution. As the cost of this data can range from a few hundred to thousands of pounds, this can be a challenging conversation given all the promotion of ‘free data’.

So, what’s the summary here?

If you’re analysing large amounts of data, e.g. for a time-series or large geographical areas, then free to access public data is a good choice as buying hundreds of images would often get very expensive and the higher spatial resolution isn’t always needed. However, if you want a specific acquisition over a specific location at high spatial resolution then the commercial missions come into their own.

Just remember, no satellite data is truly free!

Four Fantastic Forestry Applications of Remote Sensing

Landsat Images of the south-east area of Bolivia around Santa Cruz de la Sierra 27 years apart showing the changes in land use. Data courtesy of USGS/NASA.

Monitoring forest biomass is essential for understanding the global carbon cycle because:

  • Forests account for around 45% of terrestrial carbon, and deforestation accounts for 10% of greenhouse gas emissions
  • Deforestation and forest degradation release approximately identical amounts of greenhouse gases as all the world’s road traffic
  • Forests sequester significant amounts of carbon every year

The United Nations (UN) intergovernmental Reducing Emissions from Deforestation and forest Degradation in developing countries (REDD+) programme, was secured in 2013 during the 19th Conference of the Parties to the UN Framework Convention on Climate Change. It requires countries to map and monitor deforestation and forest degradation, together with developing a system of sustainable forest management. Remote sensing can play a great role in helping to deliver these requirements, and below are three fantastic remote sensing initiatives in this area.

Firstly, the Real Time System for Detection of Deforestation (DETER) gives monthly alerts on potential areas of deforestation within the Amazon rainforest. It uses data from MODIS, at 250 m pixel resolution, within a semi-automated classification technique: a computer model detects changes in land use and cover, such as forest clearing, which are then validated by human interpreters. It has been valuable in helping Brazil reduce deforestation rates by around 80% over the last decade; however, the model takes two weeks to produce its output.

Zoomed in Landsat Images of the south-east area of Bolivia around Santa Cruz de la Sierra 27 years apart showing the changes in land use. Data courtesy of USGS/NASA.

A similar initiative is FORest Monitoring for Action (FORMA), which also uses MODIS data. FORMA is a fully automated computer model that combines vegetation reflectance data from MODIS, active fire detections from NASA’s Fire Information for Resource Management System and rainfall figures to identify potential forest clearing. Like DETER it produces regular alerts, in its case twice a month, although it covers tropical humid forests worldwide.

A third initiative aims to provide faster deforestation alerts, building on the research by Hansen et al. published in 2013. The researchers used successive passes of the current Landsat satellites to monitor land cover, flagging pixels where forest cover disappears between passes. These alerts will be displayed on an online map through the World Resources Institute’s Global Forest Watch website, starting in March 2016. With the 30 m resolution of Landsat, smaller-scale changes in land use can be detected than is possible with sensors such as MODIS. Whilst this is hoped to help monitor deforestation, it doesn’t actually confirm it, as there could be other reasons for the tree loss and further investigation will be required. Being an optical mission, Landsat has problems seeing both through clouds and beneath the forest canopy, so its use will be limited in areas such as tropical rainforests.
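The core of this successive-pass approach can be sketched very simply: compare a forest/non-forest classification from one Landsat pass with the next, and flag pixels that switch from forest to non-forest. This is a minimal illustration, not the Hansen et al. method itself, which involves far more sophisticated classification and filtering; the function name is ours.

```python
import numpy as np

def flag_forest_loss(mask_t0, mask_t1):
    """Flag pixels classed as forest on the earlier pass but not the later one.

    mask_t0, mask_t1: boolean arrays (True = forest) derived from two
    successive satellite passes. Returns a boolean array of candidate
    loss pixels, which would still need further investigation to
    confirm actual deforestation.
    """
    mask_t0 = np.asarray(mask_t0, dtype=bool)
    mask_t1 = np.asarray(mask_t1, dtype=bool)
    return mask_t0 & ~mask_t1

# Toy 2x2 scene: two forest pixels disappear between passes
t0 = np.array([[True, True], [False, True]])
t1 = np.array([[True, False], [False, False]])
print(flag_forest_loss(t0, t1))
```

At 30 m resolution each flagged pixel covers under a tenth of a hectare, which is why this approach can pick up smaller clearings than MODIS-based systems.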

Finally, one way to combat the weather and canopy issues is to use radar to assess forests, and the current AfriSAR project in Gabon is doing just that – although with aircraft and Unmanned Aerial Vehicles (UAVs) rather than satellites. It began in 2015 with overflights during the dry season, and the recent flights in February 2016 captured the rainy season. The aim of this joint ESA, Gabonese Space Agency and Gabon Agency of National Parks initiative is to determine the amount of biomass and carbon stored in forests, using the unique sensitivity of P-band SAR, the lowest radar frequency used in remote sensing at 432–438 MHz. NASA joined the February flights, adding its Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) and Land, Vegetation and Ice Sensor (LVIS) instruments, which are prototypes of sensors to be used on future NASA missions. Overall, this is giving a unique dataset on tropical forests.

These are just four example projects of how remote sensing can contribute towards understanding what is happening in the world’s forests.

Why Satellite Agri-Tech Applications Will Grow In 2016

2016 is likely to be the year of agri-tech for remote sensing. Its potential has been highlighted for some time, but last year its call was loud and clear.

Agri-tech is the use of technology to improve agricultural production in terms of yield, efficiency and profitability. With a growing global population, the need to become more effective and sustainable food producers is obvious, and technology can assist in terms of robotics, biotechnology, navigation, communication, etc. However, it’s the opportunities offered by remote sensing that are most exciting to us – of course, we’re probably biased!

Remote sensing has a wide range of applications for agriculture, from mapping the underlying soil and crops and monitoring invasive species, through to seed density optimisation, irrigation management, harvest weather forecasting, yield estimation and long-term land change / land use modelling. Essentially, we can offer support from planting to plate!

Despite this potential, uptake within the agricultural sector has been low. A survey of farmers by London Economics / the Satellite Applications Catapult last summer identified barriers that included cost, difficulty justifying the investment at small scales, unreliable mobile/internet signal, a lack of software to view the data, a lack of knowledge and a lack of proven benefits.

So with all of these issues, why are we saying agri-tech will grow in 2016? There are three good reasons:

Benefits Examples – Case studies with concrete examples of the use of remote sensing are being published. For example, NASA and Applied Geosolutions worked together using Landsat 8 and MODIS data to examine temperature, greenness, leaf moisture and surface water. This allowed them to develop rice crop management plans, particularly around irrigation, improving both harvest forecasts and actual yields.

Copernicus Sentinel – We know we’ve said this before, but it’s worth saying again: this is a game changer. Both Sentinel-1 and Sentinel-2 data have signals that can be related to vegetation phenology, i.e. how plants change over time. As this data is free, it should allow companies to offer farmers products and services that are not cost prohibitive. Also, as the follow-on missions are launched the frequency of data coverage will increase – particularly important for optical sensors, where clouds can get in the way. Pixalytics has a Sentinel-2 vegetation product in test, which has already been applied to Landsat and very high resolution data, so it’s an area we’re looking to develop further. The image shows a Landsat-8 scene processed over land using a Normalised Difference Vegetation Index (NDVI) based algorithm.
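NDVI itself is a simple ratio of red and near-infrared (NIR) reflectance: healthy vegetation absorbs red light for photosynthesis but reflects strongly in the NIR, so the contrast between the two bands indicates how green a pixel is. A minimal sketch of the standard formula (the toy reflectance values below are illustrative, not taken from any real scene):

```python
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values range from -1 to 1; dense healthy vegetation typically
    scores high (e.g. above ~0.3), while water, bare soil and
    built-up areas score near or below zero.
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    # Small epsilon avoids division by zero over very dark pixels
    return (nir - red) / (nir + red + 1e-10)

# Toy reflectances: a vegetated pixel vs. a bare-soil pixel
red = np.array([0.05, 0.30])
nir = np.array([0.45, 0.35])
print(ndvi(red, nir))  # vegetated pixel scores much higher
```

For Landsat 8 the red and NIR reflectances come from bands 4 and 5 respectively; for Sentinel-2 they are bands 4 and 8, which is part of why the same algorithm transfers readily between the two missions.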

Other Data – In June the Department for Environment, Food and Rural Affairs will be making over 8,000 data sets freely available that should cover information such as soil and crop types for fields all over the country. It will provide a wealth of information for farmers to understand what crops they should be growing in which fields to maximise their yields. In addition, the UK’s National Biodiversity Network offers air quality and river level readings.

Taken together, these elements offer new opportunities for SMEs to get involved and develop products that will offer real benefits to farmers, both large and small, and will help overcome the barriers to utilising agri-tech. For the right company, with the right idea and the right implementation, 2016 will be a high-yield year!

If you are interested in agri-tech and would like to talk to us about what can be done, and what we could offer, then please get in touch.