Flip-Sides of Soil Moisture

Soil moisture changes between the 19th and 25th August around Houston, Texas, due to rainfall from Hurricane Harvey. Image courtesy of NASA Earth Observatory, by Joshua Stevens, using soil moisture data courtesy of JPL and the SMAP science team.

Soil moisture is an interesting measurement as it can be used to monitor two diametrically opposed conditions: floods and droughts. This was highlighted last week by maps produced from satellite data for the USA and for Italy. These caught our attention because soil moisture gets discussed on a daily basis in the office, due to its role in a project we’re working on in Uganda.

Soil moisture can have a variety of meanings depending on the context. For this blog we’re using soil moisture to describe the amount of water held in the spaces between soil particles in the top few centimetres of the ground. The data are collected by satellites carrying microwave instruments, which measure microwaves reflected or emitted by the Earth’s surface. The intensity of the signal depends on the amount of water in the soil, enabling a soil moisture content to be calculated.
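As a rough illustration of that final step, many retrieval schemes scale the measured microwave signal linearly between reference values for completely dry and completely saturated soil. A minimal sketch, using illustrative reference values rather than real calibration data:

```python
def soil_moisture_index(sigma0_db, dry_ref_db, wet_ref_db):
    """Relative soil moisture (0 = dry, 1 = saturated) from radar
    backscatter in dB, scaled linearly between dry and wet references."""
    index = (sigma0_db - dry_ref_db) / (wet_ref_db - dry_ref_db)
    return max(0.0, min(1.0, index))  # clamp to the physical range

# A backscatter of -14 dB, between a -20 dB dry reference and a
# -10 dB wet reference, corresponds to a relative soil moisture of 0.6.
print(soil_moisture_index(-14.0, -20.0, -10.0))
```

Real products add corrections for vegetation, surface roughness and temperature, but the core idea is this simple scaling.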

Floods
You can’t have failed to notice the devastating floods that have occurred recently in South Asia – particularly India, Nepal and Bangladesh – and in the USA. The South Asia floods were caused by monsoon rains, whilst the floods in Texas were the result of Hurricane Harvey.

Soil moisture measurements can be used to show the change in soil saturation. The map at the top of this blog, produced by NASA Earth Observatory, shows the change in soil moisture between the 19th and 25th August around Houston, Texas. The data are based on measurements acquired by the Soil Moisture Active Passive (SMAP) satellite, which uses a radiometer to measure soil moisture in the top 5 centimetres of the ground with a spatial resolution of around 9 km. On the map itself, the size of each hexagon shows how much the level of soil moisture changed, and the colour represents how saturated the soil is.

These readings identified that soil moisture levels reached as high as 60% in the immediate aftermath of the rainfall, partly due to the ferocity of the rain, which prevented the water from seeping down into the soil, so it remained at the surface instead.

Soil moisture in Italy during early August 2017. The data were compiled by ESA’s Soil Moisture CCI project. Data courtesy of ESA. Copyright: C3S/ECMWF/TU Wien/VanderSat/EODC/AWST/Soil Moisture CCI

Droughts
By contrast, Italy has been suffering a summer of drought and hot days. This year parts of the country have not seen rain for months, and temperatures have regularly topped one hundred degrees Fahrenheit. Rome, which has had seventy percent less rainfall than normal, is planning to reduce water pressure at night to conserve supplies.

This has had a clear impact on the ground, and again a soil moisture map has been produced which demonstrates it. This time the data came from ESA’s Soil Moisture Climate Change Initiative (CCI) project, which uses soil moisture data from a variety of satellite instruments. The dataset was developed by the Vienna University of Technology with the Dutch company VanderSat B.V.

The map shows the soil moisture levels in Italy during the early part of last month: the redder the area, the lower the soil moisture content.

Summary
Soil moisture is a fascinating measurement that can provide insights into ground conditions whether the rain is falling a little or a lot.

It plays an important role in the development of weather patterns and the production of precipitation, and is crucial to understanding both the water and carbon cycles that impact our weather and climate.

Optical Imagery is Eclipsed!

Solar eclipse across the USA captured by Suomi NPP VIIRS satellite on 21st August. Image courtesy of NASA/ NASA’s Earth Observatory.

Last week’s eclipse gave an excellent demonstration of the sun’s role in optical remote sensing. The image to the left was acquired on the 21st August by the Visible Infrared Imaging Radiometer Suite (VIIRS) aboard the NOAA/NASA Suomi NPP satellite, and the moon’s shadow can be clearly seen in the centre of the image.

Optical remote sensing images are the type most familiar to people, as they use the visible spectrum and essentially show the world as the human eye sees it. A sensor aboard the satellite detects sunlight reflected off the land or water; this process of light being scattered back towards the sensor by an object is known as reflectance.

Optical instruments collect data across a variety of spectral wavebands, including those beyond human vision. However, the most common form of optical image is what is known as a pseudo true-colour composite, which combines the red, green and blue wavelengths to produce an image that effectively matches human vision; i.e., in these images vegetation tends to be green, water blue and buildings grey. These are also referred to as RGB images.

These images are often enhanced by adjusting the colour palettes of the individual wavelengths so the colours stand out more: the vegetation is greener and the ocean bluer than in the original data captured by the satellite. The VIIRS image above is an enhanced pseudo true-colour composite, and the difference between the land and the ocean is clearly visible, as are the white clouds.
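The band-combination and enhancement steps described above can be sketched in a few lines of code. This is an illustrative percentile stretch on toy arrays, not the actual processing chain used for the VIIRS image:

```python
import numpy as np

def enhanced_rgb(red, green, blue, low=2, high=98):
    """Stack three single-band arrays into an RGB composite, stretching
    each band between its low/high percentiles to boost contrast."""
    channels = []
    for band in (red, green, blue):
        lo, hi = np.percentile(band, [low, high])
        channels.append(np.clip((band - lo) / (hi - lo), 0.0, 1.0))
    return np.dstack(channels)  # shape (rows, cols, 3), values in [0, 1]

# Toy 2x2 "scene": each band is a 2x2 array of reflectances.
band = np.arange(4.0).reshape(2, 2) / 4.0
rgb = enhanced_rgb(band, band * 0.5, band * 0.25)
print(rgb.shape)
```

The stretch is what makes the enhanced composite "pop" compared with the raw reflectances the satellite records.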

As we noted above, optical remote sensing works by taking the sunlight reflected from the land and water. Therefore during the eclipse the moon’s shadow means no sunlight reaches the Earth beneath, causing the circle of no reflectance (black) in the centre of the USA. This is also the reason why no optical imagery is produced at night.

This also explains why the nemesis of optical imagery is clouds! In cloudy conditions, the sunlight is reflected back to the sensor by the clouds and does not reach the land or water. In this case the satellite images simply show swirls of white!

Mosaic composite image of the solar eclipse over the USA on the 21st August 2017, acquired by MODIS. Image courtesy of NASA Earth Observatory, by Joshua Stevens and Jesse Allen, using MODIS data from the Land Atmosphere Near real-time Capability for EOS (LANCE) and EOSDIS/Rapid Response

A second eclipse image was produced from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra satellite. Shown on the left this is a mosaic image from the 21st August, where:

  • The right third of the image shows the eastern United States at about 12:10 p.m. Eastern Time, before the eclipse had begun.
  • The middle part was captured at about 12:50 p.m. Central Time during the eclipse.
  • The left third of the image was collected at about 12:30 p.m. Pacific Time, after the eclipse had ended.

Again, the moon’s shadow is obvious from the black area on the image.

Hopefully, this gives you a bit of an insight into how optical imagery works and why you can’t get optical images at night, under cloudy conditions or during an eclipse!

Silver Anniversary for Ocean Altimetry Space Mission

Artist rendering of Jason-3 satellite over the Amazon.
Image Courtesy NASA/JPL-Caltech.

August 10th 1992 marked the launch of the TOPEX/Poseidon satellite, the first major oceanography-focused mission. Twenty-five years, and three successor satellites, later, the dataset begun by TOPEX/Poseidon is still going strong, providing sea surface height measurements.

TOPEX/Poseidon was a joint mission between NASA and France’s CNES space agency, with the aim of mapping ocean surface topography to improve our understanding of ocean currents and global climate forecasting. It measured ninety-five percent of the world’s ice-free oceans within each ten-day revisit cycle. An altimeter works by sending radio pulses towards the Earth and measuring the characteristics of the returned echo. The satellite carried two such instruments: a single-frequency Ku-band solid-state altimeter, and a dual-frequency altimeter sending out pulses in the Ku and C bands at 13.6 GHz and 5.3 GHz respectively. Two bands were used because the ionospheric delay, caused by charged particles in the upper atmosphere slowing the returned signal, depends on frequency, so the difference between the two measurements provides an estimate of that delay.
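The dual-frequency trick works because the ionospheric delay scales with the inverse square of frequency, so the two range measurements can be combined to cancel it. A sketch of the standard ionosphere-free combination, using the frequencies from the text but made-up range values:

```python
def iono_free_range(range_ku, range_c, f_ku=13.6e9, f_c=5.3e9):
    """Ionosphere-free range (metres) from two altimeter range
    measurements, exploiting the 1/f^2 scaling of ionospheric delay."""
    f1sq, f2sq = f_ku**2, f_c**2
    return (f1sq * range_ku - f2sq * range_c) / (f1sq - f2sq)

# Simulate a 1,000 km true range with a frequency-dependent delay:
true_range = 1.0e6
k = 1.0e18  # delay constant (proportional to total electron content)
r_ku = true_range + k / 13.6e9**2  # Ku band sees a small extra delay
r_c = true_range + k / 5.3e9**2    # C band sees a larger extra delay
print(round(iono_free_range(r_ku, r_c), 3))
```

Substituting the two delayed ranges into the combination cancels the `k` terms exactly, recovering the true range.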

When TOPEX/Poseidon’s altimetry data were combined with other information from the satellite, sea surface heights could be calculated to an accuracy of 4.2 cm. In addition, the strength and shape of the return signal allow wave height and wind speed to be determined. Despite being planned as a three-year mission, TOPEX/Poseidon was actually active for thirteen years, until January 2006.

The value of these sea surface height measurements led to a successor mission, Jason-1, launched on December 7th 2001. It was put into a co-ordinated orbit with TOPEX/Poseidon and the two took measurements in tandem for three years, which both increased data frequency and allowed cross-calibration of the instruments. Jason-1 carried a CNES Poseidon-2 altimeter using the same C and Ku bands, and following the same methodology it could measure sea surface height to an improved accuracy of 3.3 cm. It made observations for twelve years, and was also overlapped by its successor, Jason-2.

Jason-2 was launched on the 20th June 2008, carrying a CNES Poseidon-3 altimeter with C and Ku bands and the intention of measuring sea surface height to within 2.5 cm. With Jason-2, the National Oceanic and Atmospheric Administration (NOAA) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) took over management of the data. The satellite is still active; however, due to suspected radiation damage its orbit was lowered by 27 km, enabling it to produce an improved, high-resolution estimate of Earth’s average sea surface height, which in turn will help improve the quality of maps of the ocean floor.

Following the established pattern, Jason-3 was launched on the 17th January 2016. It’s carrying a Poseidon-3B radar altimeter, again using the same C and Ku bands and on a ten day revisit cycle.

Together these missions have provided a 25 year dataset on sea surface height, which has been used for applications such as:

  • El Niño and La Niña forecasting
  • Extreme weather forecasting for hurricanes, floods and droughts
  • Ocean circulation modelling, including how seasonal circulation affects climate by moving heat around the globe
  • Tidal forecasting and showing how this energy plays an important role in mixing water within the oceans
  • Measurement of inland water levels – at Pixalytics we have a product that we have used to measure river levels in the Congo, and which is part of our International Partnership Programme work in Uganda.

In the future, the dataset will be taken forward by the Jason Continuity of Service (Jason-CS) on the Sentinel-6 ocean mission which is expected to be launched in 2020.

Overall, altimetry data from this series of missions is a fantastic resource for operational oceanography and inland water applications, and we look forward to its next twenty five years!

Landsat Turns 45!

False colour image of Dallas, Texas. The first fully operational Landsat image taken on July 25, 1972, Image courtesy: NASA’s Earth Observatory

Landsat has celebrated forty-five years of Earth observation this week. The first Landsat mission was Earth Resources Technology Satellite 1 (ERTS-1), which was launched into a sun-synchronous near polar orbit on the 23 July 1972. It wasn’t renamed Landsat-1 until 1975. It had an anticipated life of 1 year and carried two instruments: the Multi Spectral Scanner (MSS) and the Return-Beam Vidicon (RBV).

The Landsat missions have data continuity at their heart, which has given a forty-five year archive of Earth observation imagery. However, as technological capabilities have developed the instruments on consecutive missions have improved. To demonstrate and celebrate this, NASA has produced a great video showing the changing coastal wetlands in Atchafalaya Bay, Louisiana, through the eyes of the different Landsat missions.

In total there have been eight Landsat missions, but Landsat 6 failed to reach its designated orbit and never collected any data. The successful missions have been:

  • Landsat 1 launched on 23 July 1972.
  • Landsat 2 launched on 22 January 1975.
  • Landsat 3 launched on 5 March 1978.
  • Landsat 4 launched on 16 July 1982.
  • Landsat 5 launched on 1 March 1984.
  • Landsat 7 launched on 15 April 1999, and is still active.
  • Landsat 8 launched on 11 February 2013, and is still active.

Landsat 9 is planned for launch at the end of 2020, and Landsat 10 is already being discussed.

Some of the key successes of the Landsat mission include:

  • Over 7 million scenes of the Earth’s surface have been captured.
  • Over 22 million scenes have been downloaded through the USGS-EROS website since 2008, when the data was made free-to-access, with the rate continuing to increase (Campbell 2015).
  • The economic value of just one year of Landsat data far exceeds the multi-year total cost of building, launching, and managing Landsat satellites and sensors.
  • Landsat 5 officially set a new Guinness World Records title for the ‘Longest-operating Earth observation satellite’ with its 28 years and 10 months of operation when it was decommissioned in December 2012.
  • ESA provides Landsat data downlinked via their own data receiving stations; the ESA dataset includes data collected over the open ocean, whereas USGS does not, and the data is processed using ESA’s own processor.

The journey hasn’t always been smooth. Although established by NASA, Landsat was transferred to the private sector under the management of NOAA in the early 1980s, before returning to US Government control in 1992. There have also been technical issues: the failure of Landsat 6 described above, and a Scan Line Corrector failure on Landsat 7 on the 31st May 2003, which means that instead of mapping in straight lines it follows a zigzag ground track. This leaves parts of the edge of each image unmapped, giving a black stripe effect; the centre of each image is unaffected, though, and the data can still be used.

Landsat was certainly a game changer in the remote sensing and Earth observation industries, both in terms of the data continuity approach and the decision to make the data free to access. It has provided an unrivalled archive of the changing planet which has been invaluable to scientists, researchers, book-writers and businesses like Pixalytics.

We salute Landsat and wish it many more years!

If no-one is there when an iceberg is born, does anyone see it?

Larsen C ice Shelf including A68 iceberg. Image acquired by MODIS Aqua satellite on 12th July 2017. Image courtesy of NASA.

The titular paraphrasing of the famous falling tree in the forest riddle was well and truly answered this week, and shows just how far satellite remote sensing has come in recent years.

Sometime last week, between Monday 10th July and Wednesday 12th July 2017, a huge iceberg was created by calving off the Larsen C Ice Shelf in Antarctica. It is one of the biggest icebergs ever recorded according to scientists from Project MIDAS, a UK-based Antarctic research project, who estimate its area to be 5,800 sq km and its weight to be more than a trillion tonnes. It has reduced the Larsen C Ice Shelf by more than twelve percent.

The iceberg has been named A68, which is a pretty boring name for such a huge iceberg. However, icebergs are named by the US National Ice Center, and the letter comes from where the iceberg was first sighted – in this case the A represents the area from zero degrees to ninety degrees west, covering the Bellingshausen and Weddell Seas. The number is simply the order in which they are discovered, which I assume means there have been 67 previous icebergs!
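The quadrant convention can be captured as a small lookup. The letter boundaries below follow the US National Ice Center scheme as I understand it (A: 0–90°W, B: 90°W–180°, C: 180°–90°E, D: 90°E–0°), so treat this as an illustrative sketch rather than an official implementation:

```python
def iceberg_quadrant(longitude):
    """Letter prefix for an Antarctic iceberg name, from the longitude
    of first sighting (degrees; negative = west, positive = east)."""
    if -90 <= longitude < 0:
        return "A"  # 0 to 90W: Bellingshausen / Weddell Seas
    if longitude < -90:
        return "B"  # 90W to 180: Amundsen / eastern Ross Seas
    if longitude >= 90:
        return "C"  # 180 to 90E: western Ross Sea / Wilkes Land
    return "D"      # 90E to 0: Amery / eastern Weddell Sea

# Larsen C sits at roughly 62 degrees west, hence the "A" prefix.
print(iceberg_quadrant(-62))
```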

After satisfying my curiosity on the iceberg names, the other element that caught our interest was the host of Earth observation satellites that captured images of either the creation, or the newly birthed, iceberg. The ones we’ve spotted so far, although there may be others, are:

  • ESA’s Sentinel-1 has been monitoring the area for the last year as an iceberg splitting from Larsen C was expected. Sentinel-1’s SAR imagery has been crucial to this monitoring as the winter clouds and polar darkness would have made optical imagery difficult to regularly collect.
  • Whilst Sentinel-1 was monitoring the area, it was actually NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) instrument onboard the Aqua satellite which confirmed the ‘birth’ on the 12th July with a false colour image at 1 km spatial resolution using band 31 which measures infrared signals. This image is at the top of the blog and the dark blue shows where the surface is warmest and lighter blue indicates a cooler surface. The new iceberg can be seen in the centre of the image.
  • Longwave infrared imagery was also captured by the NOAA/NASA Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite on July 13th.
  • Similarly, NASA also reported that Landsat 8 captured a false-colour image from its Thermal Infrared Sensor on the 12th July showing the relative warmth or coolness of the Larsen C ice shelf – with the area around the new iceberg being the warmest giving an indication of the energy involved in its creation.
  • Finally, Sentinel-3A has also got in on the thermal infrared measurement using the bands of its Sea and Land Surface Temperature Radiometer (SLSTR).
  • ESA’s CryoSat has been used to calculate the size of the iceberg, using its Synthetic Aperture Interferometric Radar Altimeter (SIRAL) to measure the height of the iceberg above the water. Using this data, it has been estimated that the iceberg contains around 1,155 cubic km of ice.
  • The only optical imagery we’ve seen so far is from the Deimos-1 satellite, owned by Deimos Imaging, an UrtheCast company. This is from the 14th July and revealed that the giant iceberg was already breaking up into smaller pieces.
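The step from CryoSat’s height-above-water measurement to a volume relies on hydrostatic equilibrium: only around a tenth of a floating iceberg sits above the waterline, the exact fraction set by the densities of ice and seawater. A rough sketch with typical density values; the freeboard figure is illustrative, chosen to land near the published volume:

```python
def iceberg_thickness(freeboard_m, rho_ice=917.0, rho_sea=1027.0):
    """Total thickness (m) of a floating iceberg from its freeboard
    (height above the waterline), assuming hydrostatic equilibrium:
    rho_ice * H = rho_sea * (H - freeboard)."""
    return freeboard_m * rho_sea / (rho_sea - rho_ice)

# ~21 m of freeboard implies roughly 200 m of total thickness...
thickness_m = iceberg_thickness(21.0)
# ...which over A68's ~5,800 sq km area gives a volume in the same
# ballpark as the ~1,155 cubic km estimate.
volume_km3 = (thickness_m / 1000.0) * 5800.0
print(round(thickness_m), round(volume_km3))
```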

It’s clear this is a huge iceberg – so huge, in fact, that most news agencies don’t think readers can comprehend its vastness without a comparison. Some of the ones I came across were:

  • Size of the US State of Delaware
  • Twice the size of Luxembourg
  • Four times the size of greater London
  • Quarter of the size of Wales – UK people will know that Wales is almost an unofficial unit of size measurement in this country!
  • Has the volume of Lake Michigan
  • Has twice the volume of Lake Erie
  • Has the volume of 463 million Olympic-sized swimming pools; and
  • My favourite compares its size to the A68 road in the UK, which runs from Darlington to Edinburgh.

This event shows how satellites are monitoring the planet, and the different ways we can see the world changing.

AgriTech Seeds Start to Grow in Cornwall

On Monday I attended the Jump Start AgriTech event hosted by the South West Centre of Excellence in Satellite Applications at the Tremough Innovation Centre on the University of Exeter’s Penryn campus near Falmouth in Cornwall. As the name suggests the one day event covered innovations in AgriTech with a particular focus on what is, or could be, happening in the South West.

The day began with a series of short presentations and Paul Harris, Rothamsted Research, was up first on their Open Access Farm Platform. North Wyke Farm in Devon has been equipped with a variety of sensors and instruments to understand the effects of different farming practices. Of particular interest to me was their analysis of run-off, weather monitoring and soil moisture every 15 minutes; this is a great resource for satellite product validation.

I was up next, talking about Earth Observation (EO) Satellite Data for AgriTech. Having seen people overpromise and oversell EO data too many times, I began by getting people to think about what they were trying to achieve before looking at the technology. The circle of starting questions, on the right, is how I begin with potential clients. If the answers suggest satellite EO is the right technology, then you can start considering combinations of optical and microwave data, and of free-to-access and commercial data. I went on to show the different types of satellite imagery and what differences in spatial resolution look like in an agricultural setting.

I was followed by Vladimir Stolikovic, Satellite Applications Catapult, who focused on the Internet of Things and how it’s important to have sensor network data collected and communicated, with satellite broadband being used in conjunction with mobile phones and WiFi coverage.

Our last talk was by Dr Karen Anderson, University of Exeter, who looked at how drones can capture more than imagery. I was particularly intrigued by the ‘structure from motion photogrammetry’ technique, which allows heights to be determined from multiple images, so that for a much lower cost you can create something similar to what is acquired from a Lidar or laser-scanning instrument. Also, by focusing on extracting height, data can be collected in conditions with variable amounts of light, such as under clouds, and it doesn’t require high-accuracy radiometric calibration.

After coffee, case studies were presented on farming applications:

  • VirtualVet – collecting data on animal health and drug use digitally, via mobile apps, so paper records don’t become out of date and data can be collated to gain greater insights.
  • Steve Chapman, SC Nutrition Ltd, talked about improving milk production by making sure dried food is optimally prepared – large pieces of dried sweetcorn are digested less well, and a lower nutritional value is extracted from them.
  • The delightfully named Farm Crap App, from FoAM Kernow, aims to encourage farmers to spread manure rather than use artificial fertilizer. Farmers have tended to go for the latter as it is easier to calculate the effects, so having advice, regulations and the important calculations in a phone app, rather than in paper tables, should help them use manure.
  • Caterina Santachiara, ABACO, described their siti4FARMER solution, a cloud-computing-based platform that includes data scaling from the field to the farm and large land areas, with individual customisation so that users can easily see what they need to know.
  • Finally, Glyn Jones from AVANTI, talked about how farmers can stay connected to the internet, and tech support, while out in their fields. This sounds straightforward, but none of the current technologies work well enough – mainly due to the fact that fields aren’t flat! So a new technological area of investigation is ‘white space’ – these are frequencies allocated to broadcasting services, but left unused in particular geographical locations as buffers. The availability varies from location to location, but it is available to lower-powered devices.

After lunch, there were some presentations on Agritech funding opportunities from Innovate UK, AgriTech Cornwall and the South West Centre of Excellence in Satellite Applications. The day concluded with a facilitated session where small groups explored a variety of different ideas in more detail.

It was a really good day, and shows that there is real potential for AgriTech to grow in the South West.

Great Barrier Reef Coral Bleaching

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

Coral bleaching on the Great Barrier Reef in Australia was worse than expected last year, and a further decline is expected in 2017 according to the Great Barrier Reef Marine Park Authority. In a document issued this week they noted that, along with reefs across the world, the Great Barrier Reef has had widespread coral decline and habitat loss over the last two years.

We’ve written about coral bleaching before, as it’s a real barometer of climate change. To put the importance of the Great Barrier Reef into context:

  • It’s 2300 km long and covers an area of around 70 million football pitches;
  • Consists of 3000 coral reefs, which are made up from 650 different types of hard and soft coral; and
  • Is home to over 1500 types of fish and more than 100 varieties of sharks and rays.

Coral bleaching occurs when water stress causes coral to expel the photosynthetic algae which give coral their colours, exposing the skeleton and turning them white. The stress is mostly due to higher seawater temperatures, although cold-water stress, run-off, pollution and high solar irradiance can also cause bleaching. Whilst bleaching does not kill coral immediately, it does put them at greater risk of mortality from storms, poor water quality, disease and the crown-of-thorns starfish.

Last year the Great Barrier Reef suffered its worst bleaching on record; aerial and in-water surveys identified that 29% of shallow water coral reefs died in 2016, up from the original estimate of 22%. The most severe mortality was in an area to the north of Port Douglas, where 70% of the shallow water corals died. This is hugely sad news to Sam and me, as we explored this area of the Great Barrier Reef ourselves about fifteen years ago.

Whilst hugely concerning, there is also a little hope! There was a strong recovery of coral in the south of the Great Barrier Reef, where bleaching and other impacts were less severe.

Images from the Copernicus Sentinel-2A satellite captured on 8 June 2016 and 23 February 2017 show coral turning bright white for Adelaide Reef, Central Great Barrier Reef. Data courtesy of Copernicus/ESA, and contains modified Copernicus Sentinel data (2016–17), processed by J. Hedley; conceptual model by C. Roelfsema

The coral bleaching event this year has also been captured by Sentinel-2. Scientists from ESA’s Sen2Coral project have used change detection techniques to determine bleaching. Images between January and April showed areas of coral turning bright white and then darkening, although it was unclear whether the darkening was due to coral recovery or dead coral being overgrown with algae. In-water surveys were undertaken, which confirmed the majority of the darkened areas were algal overgrowth.
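A change-detection step like the one applied here can be illustrated very simply: difference two co-registered images and flag pixels that brightened (candidate bleaching) or darkened (candidate recovery or algal overgrowth). The threshold and arrays below are made up for illustration; Sen2Coral’s actual method is more sophisticated:

```python
import numpy as np

def brightness_change(before, after, threshold=0.15):
    """Flag pixels as brightened (+1), darkened (-1) or unchanged (0)
    between two co-registered reflectance images scaled 0-1."""
    diff = after.astype(float) - before.astype(float)
    return np.where(diff > threshold, 1,
                    np.where(diff < -threshold, -1, 0))

before = np.array([[0.20, 0.20],
                   [0.50, 0.50]])
after = np.array([[0.60, 0.20],   # top-left brightened (bleaching?)
                  [0.50, 0.10]])  # bottom-right darkened (overgrowth?)
flags = brightness_change(before, after)
print(flags)
# [[ 1  0]
#  [ 0 -1]]
```

As the blog notes, the satellite alone cannot tell recovery from overgrowth in the darkened pixels; that distinction needed the in-water surveys.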

This work has proved that coral bleaching can be seen from space, although it needs to be supported by in-situ work. ESA intends to develop a coral reef tool, which will be part of the open-source Sentinel Application Platform (SNAP) toolkit. This will enable anyone to monitor the health of coral reefs worldwide and hopefully, help protect these natural wonders.

UK Space Conference Getting Ready For Take Off

Next week we’ll be in Manchester at the 2017 UK Space Conference.

The UK Space Conference is held every two years, and attracted over 1,000 delegates and over 100 exhibitors when held in Liverpool in 2015. It is a key event bringing together the UK space community, and this year it is taking place over three days, from the 30th May to the 1st June.

We are exhibiting on stand C7, near the centre of the hall, where you’ll be able to come and talk to us about our products and services including:

  • Atmospheric correction
  • Consultancy services
  • Education & training
  • Flood mapping
  • Ocean colour
  • Spatial analyses & data management
  • Terrestrial vegetation
  • Turbidity mapping

We’re also delighted to announce that our Flood Mapping work is one of the products highlighted in the Innovation Zone, which is sponsored by Innovate UK. It is a low cost floodwater mapping product based on Sentinel-1 radar data, which provides easy to understand flood information and maps through an online portal without the need for specialist knowledge. We have partnered with Harris Geospatial Solutions to provide a fully automated solution.

We’ll also have copies of our book for sale, ‘Practical Handbook of Remote Sensing’. This takes complete novices through the process of finding, downloading, processing, visualising and applying remote sensing satellite data using their own PC, open-source software and a standard internet connection.

The 2017 UK Space Conference itself begins on the Tuesday morning with ‘Space 101’, which is a series of workshops covering some of the key issues related to working in the space sector. The conference then kicks off at lunchtime on the Tuesday with an opening plenary on the latest developments in the UK space sector.

There is a networking event in the Exhibition Hall between 6pm and 9pm on Tuesday evening, and we’ll be on our stand all evening.

Wednesday is brimming over with workshops, presentations, plenary and poster sessions, culminating in the Gala Dinner and Sir Arthur Clarke Awards. Finally, Thursday has another busy day of workshops and plenary sessions, before the Conference closes in the afternoon.

We’re really excited about being in Manchester next week, and looking forward to meeting old and new friends.

We hope that any of you who are at the Conference will come up and say hello! We’d love to meet you!

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS image from 25 April 2017 of the Yucatán Peninsula, showing where thermal bands have picked up increased temperatures. Data courtesy of NASA; NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways of how satellites can monitor fires.

Acquired on the 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument’s thermal band detected temperatures higher than normal.
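Those red dots are the result of a thermal-anomaly test: a pixel is flagged when its brightness temperature stands well above the surrounding background. The real VIIRS and MODIS active-fire algorithms use multi-band contextual tests; the single-threshold version below, with made-up numbers, is a deliberately simplified sketch of the idea:

```python
import numpy as np

def fire_pixels(bt_kelvin, background_kelvin, delta=10.0):
    """Flag pixels whose brightness temperature exceeds the local
    background estimate by more than `delta` kelvin."""
    return bt_kelvin > background_kelvin + delta

bt = np.array([300.0, 315.0, 298.0])          # observed temperatures
background = np.array([299.0, 300.0, 299.0])  # local background estimate
flags = fire_pixels(bt, background)
print(flags)  # only the middle pixel is flagged as a fire
```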

False colour image of the West Mims fire on Florida/Georgia boundary acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border, imaged from NASA’s Aqua satellite on the 2nd May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). In the natural colour image the fires could only be seen as smoke plumes, but on the left is the false colour image, which combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself is shown as orange.

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; it appears that the MODIS fire products contribute part of the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for this practice.

Imaged by Sentinel-2, burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest burnt last year in the Republic of Congo in Africa – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with the Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, although the revisit time will improve to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36,000 hectares of forest were burnt in 2016.
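One simple way to picture the optical/SAR combination is a per-pixel merge in which cloud-free Sentinel-2 detections are kept and Sentinel-1 fills in under cloud. The sketch below uses made-up boolean masks and is only a conceptual illustration of the gap-filling idea, not the ESA processing chain:

```python
import numpy as np

def merge_burn_maps(optical_burnt, cloud_mask, sar_burnt):
    """Per-pixel merge: trust the optical burnt-area map where the sky
    was clear, and fall back to the weather-independent SAR map under cloud."""
    return np.where(cloud_mask, sar_burnt, optical_burnt)

# Toy 3x3 example: True marks a pixel flagged as burnt.
optical = np.array([[True,  False, False],
                    [False, False, False],
                    [False, False, True]])
cloud   = np.array([[False, True,  True],
                    [False, True,  False],
                    [False, False, False]])
sar     = np.array([[False, True,  False],
                    [False, False, False],
                    [False, False, True]])

combined = merge_burn_maps(optical, cloud, sar)
print(combined.astype(int))
```

Here the pixel at row 0, column 1 is cloud-covered in the optical image, so the SAR detection is what flags it as burnt in the combined map.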

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses which should potentially help save lives. This is another example of the societal benefit of satellite remote sensing.

Beware of the Bluetooth Gnomes and Other Stories from GISRUK 2017

Gorton Monastery, GISRUK 2017

The 2017 GIS Research UK (GISRUK) Conference took place last week in Manchester, and Pixalytics sponsored the Best Early-Career Researcher Prize.

I was looking forward to the event, but I nearly didn’t get there! I was planning to catch the train up from London on Wednesday. However, the trackside fire at Euston station put paid to that, as my train was cancelled. Instead I was at the station bright and early on Thursday morning.

The first presentation I saw was the inspiring keynote by Professor Andrew Hudson-Smith. He talked about ‘getting work out there and used’ and using the Internet of Things to create a ‘census of now’, i.e. collecting data continuously rather than relying on census data that is several years out of date. Personally, I also enjoyed hearing about his Bluetooth gnomes in Queen Elizabeth Olympic Park, which talk to you about cyber security. A visit to his gnomes is definitely on my list for the next spare weekend in London!

I spent the rest of the afternoon in the Infrastructure stream of presentations, where there were talks on spatially modelling the impact of hazards (such as flooding) on the National Grid network, human exposure to hydrocarbon pollution in Nigeria, deciding where to site renewable energy installations (and of what type), and investigating taxi journeys.

In the evening, the conference dinner was at ‘The Monastery’, also known as Gorton Monastery. Despite the name, it was actually a friary built by Franciscan friars who travelled to Manchester in 1861 to serve the local Catholic community. It was one of the first churches to be completed by the Franciscans in England after the Reformation. It became derelict in the mid-1990s and ended up on the World Monuments Fund Watch List of 100 Most Endangered Sites in the World. Since then it has been restored and is used as a spectacular community venue.

Friday started with the morning parallel sessions, and I picked ‘Visualisation’ followed by ‘Machine Learning’. Talks included ‘the Curse of Cartograms’ (and if you don’t know what these curses are, have a look here!), land-use mapping, and tracking behaviour at music festivals using mobile-phone-generated data – which won the best spatial analysis paper. However, my favourite talk was given by Gary Priestnall on projection augmented relief models: physical models of a location’s terrain overlaid with imagery or video from a projector. The effect was fantastic!

Our closing keynote, ‘The Great Age of Geography 2017’, was from Nick Crane, known to UK TV viewers as the ‘map man’. He reflected on the role of geographers throughout history and then into the future. He equated the breakthrough in printing, from wood blocks to copper plates that could be engraved in more detail and updated, to today’s transition from analogue to digital.

The conference finished with the awards. I was delighted to present Alyson Lloyd and James Cheshire with the Best Early-Career Researcher Prize for their presentation on ‘Challenges of Big Data for Social Science: Addressing Uncertainty in Loyalty Card Data’. Unfortunately, as it was on Wednesday afternoon, it wasn’t one I’d seen personally. However, I’ve downloaded the conference paper, available from here, and I’m looking forward to reading it.

It was an excellent conference, and I really enjoyed my time in Manchester. Looking forward to GISRUK 2018!