Inspiring the Next Generation of EO Scientists

Artist’s rendition of a satellite – 3dsculptor/123RF Stock Photo

Last week, whilst Europe’s Earth Observation (EO) community was focussed on the successful launch of Sentinel-5P, over in America, Tuesday 10th October was Earth Observation Day!

This annual event is co-ordinated by AmericaView, a non-profit organisation whose aim is to advance the widespread use of remote sensing data and technology through education and outreach, workforce development, applied research, and technology transfer to the public and private sectors.

Earth Observation Day is a Science, Technology, Engineering, and Mathematics (STEM) event celebrating the Landsat mission and its forty-five year archive of imagery. Using satellite imagery provides valuable experience for children in maths and sciences, together with introducing subjects such as land cover, food production, hydrology, habitats, local climate and spatial thinking. The AmericaView website contains a wealth of EO materials available for teachers to use, from fun puzzles and games through to a variety of remote sensing tutorials. Even more impressive is that the event links schools to local scientists in remote sensing and geospatial technologies. These scientists provide support to teachers including giving talks, helping design lessons or being available to answer students’ questions.

This is a fantastic event by AmericaView, supported by wonderful resources and remote sensing specialists. We first wrote about this three years ago and thought the UK would benefit from something similar. We still do. The UK Space Agency recently offered an opportunity for organisations interested in providing education and outreach activities to support EO, the satellite launch programme or the James Webb Space Telescope. It will be interesting to see what the successful candidates come up with.

At Pixalytics we’re passionate about educating and inspiring the next generation of EO scientists. For example, we regularly support the Remote Sensing and Photogrammetry Society’s Wavelength conference for students and early career scientists; and sponsored the Best Early-Career Researcher prize at this year’s GISRUK Conference. We’re also involved with two exciting events at Plymouth’s Marine Biological Association, a Young Marine Biologists (YMB) Summit for 12-18 year olds at the end of this month and their 2018 Postgraduate conference.

Why is this important?
The space industry, and the EO sector within it, is continuing to grow. According to Euroconsult’s ‘Satellites to Be Built & Launched by 2026’ – I know this is another of the expensive reports we highlighted recently – there will be around 3,000 satellites with a mass above 50 kg launched in the next decade, of which around half are anticipated to be used for EO or communications purposes. This is almost double the number of satellites launched in the last ten years, and it doesn’t include the increasing number of nanosatellites and CubeSats going up.

Alongside the number of satellites, technological developments mean that the amount of EO data available is increasing almost exponentially. For example, earlier this month World View successfully completed a multi-day flight of its Stratollite™ service, which uses high-altitude balloons that can steer by riding stratospheric winds. These vehicles can carry a variety of sensors – a megapixel camera flew on the recent flight – offering an alternative platform for collecting EO data.

Therefore, we need a future EO workforce who are excited, and inspired, by the possibilities and who will take this data and do fantastic things with it.

To find that workforce we need to shout about our exciting industry and make sure everyone knows about the career opportunities available.

Supporting Soil Fertility From Space

Sentinel-2 pseudo-true colour composite from 2016 with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. Sentinel data courtesy of ESA/Copernicus.

Last Tuesday I was at the academic launch event for the Tru-Nject project at Cranfield University. Despite the event’s title, it was in fact an end of project meeting. Pixalytics has been involved in the project since July 2015, when we agreed to source and process high resolution satellite Earth Observation (EO) imagery for them.

The Tru-Nject project is funded via Innovate UK. Its official title is ‘Tru-Nject: Proximal soil sensing based variable rate application of subsurface fertiliser injection in vegetable/combinable crops’. The focus is on modelling soil fertility within fields, to enable fertiliser to be applied in varying amounts using point-source injection technology, which reduces nitrogen loss to the atmosphere compared with spreading fertiliser on the soil surface.

To do this, the project created soil fertility maps from a combination of EO products, physical sampling and proximal soil sensing – where approximately 15,000 georeferenced hyperspectral spectra are collected using an instrument mounted on a tractor. These fertility maps are then interpreted by an agronomist, who decides on the relative application of fertiliser.

Initial results have shown that applying increased fertiliser to areas of low fertility improves overall yield when compared to applying an equal amount of fertiliser everywhere, or applying more fertiliser to high yield areas.

Pixalytics’ involvement in the work focussed on acquiring and processing historical, and new, sub-5 metre optical satellite imagery for two fields, near Hull and York. We primarily acquired data from the Kompsat satellites operated by the Korea Aerospace Research Institute (KARI), supplemented with WorldView data from DigitalGlobe. Once we’d acquired the imagery, we processed it to:

  • remove the effects of the atmosphere, termed atmospheric correction, and then
  • convert the corrected imagery to maps of vegetation greenness (a sketch of this step follows below).
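
The greenness maps are based on the Normalized Difference Vegetation Index (NDVI) mentioned in the figure caption above. As a minimal sketch of that second step – assuming you already have atmospherically corrected red and near-infrared reflectance bands loaded as NumPy arrays; the array values here are made up – the calculation looks something like this:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from surface reflectance bands."""
    red = red.astype("float64")
    nir = nir.astype("float64")
    denom = nir + red
    out = np.full(red.shape, np.nan)
    valid = denom > 0          # avoid dividing by zero over water/shadow pixels
    out[valid] = (nir[valid] - red[valid]) / denom[valid]
    return out

# Example with synthetic reflectance values in the range 0-1
red = np.array([[0.05, 0.10], [0.20, 0.04]])
nir = np.array([[0.40, 0.35], [0.22, 0.45]])
print(ndvi(red, nir))  # values close to +1 indicate dense green vegetation
```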

The new imagery needed to coincide with a particular stage of crop growth, which meant the satellite data acquisition window was narrow. To ensure that we collected data on specific days, we tasked the Kompsat satellites each year. This led to a pleasant surprise for Dave George, Tru-Nject Project Manager, who said, “I never believed I’d get to tell a satellite what to do.”

Whilst we were quite successful with the tasking, the combination of this being the UK and the fields being relatively small meant that some of the images were partly affected by cloud. Where this occurred we gap-filled with Copernicus Sentinel-2 data, which has a coarser spatial resolution (10 m) but more regular acquisitions.

In addition, we needed to undertake vicarious adjustment to ensure that we produced consistent products over time, as the data came from different sensors with different specifications. Since we cannot go up to a satellite to measure its calibration, vicarious adjustment uses ground measurements and algorithms not only to cross-calibrate the data, but also to adjust for errors in the atmospheric correction.
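
Our operational adjustment is more involved than we can show here, but the core idea of an empirical cross-calibration can be sketched as fitting a gain and offset between two sensors over stable ground targets. Everything in this snippet – the target reflectances and the fitted coefficients – is illustrative only:

```python
import numpy as np

# Reflectance of the same stable ground targets seen by two sensors
# (hypothetical values; in practice these come from co-located, cloud-free pixels)
reference = np.array([0.05, 0.12, 0.21, 0.33, 0.41])    # the sensor we trust
uncalibrated = np.array([0.07, 0.15, 0.26, 0.39, 0.49])  # the sensor to adjust

# Fit a linear relationship: reference ≈ gain * uncalibrated + offset
gain, offset = np.polyfit(uncalibrated, reference, 1)

def adjust(reflectance):
    """Apply the empirical gain/offset so the two sensors produce consistent values."""
    return gain * reflectance + offset

print(gain, offset)
print(adjust(uncalibrated))  # should now track the reference closely
```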

An example of the work is shown at the top: a Sentinel-2 pseudo-true colour composite from 2016, with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. The greener the NDVI product appears, the greener the vegetation; note that the two datasets were collected in different years, so the planting within the field varies.

We’ve really enjoyed working with Stockbridge Technology Centre Ltd (STC), Manterra Ltd, and Cranfield University, who were the partners in the project. Up until last week all the work was done via telephone and email, and so it was great to finally meet them in-person, hear about the successful project and discuss ideas for the future.

Landsat Turns 45!

False colour image of Dallas, Texas – the first fully operational Landsat image, taken on July 25, 1972. Image courtesy of NASA’s Earth Observatory.

Landsat has celebrated forty-five years of Earth observation this week. The first Landsat mission was Earth Resources Technology Satellite 1 (ERTS-1), which was launched into a sun-synchronous near polar orbit on the 23 July 1972. It wasn’t renamed Landsat-1 until 1975. It had an anticipated life of 1 year and carried two instruments: the Multi Spectral Scanner (MSS) and the Return-Beam Vidicon (RBV).

The Landsat missions have data continuity at their heart, which has given a forty-five year archive of Earth observation imagery. However, as technological capabilities have developed the instruments on consecutive missions have improved. To demonstrate and celebrate this, NASA has produced a great video showing the changing coastal wetlands in Atchafalaya Bay, Louisiana, through the eyes of the different Landsat missions.

In total there have been eight Landsat missions, although Landsat 6 failed to reach its designated orbit and never collected any data. The missions have been:

  • Landsat 1 launched on 23 July 1972.
  • Landsat 2 launched on 22 January 1975.
  • Landsat 3 launched on 5 March 1978.
  • Landsat 4 launched on 16 July 1982.
  • Landsat 5 launched on 1 March 1984.
  • Landsat 7 launched on 15 April 1999, and is still active.
  • Landsat 8 launched on 11 February 2013, and is still active.

Landsat 9 is planned for launch at the end of 2020, and Landsat 10 is already being discussed.

Some of the key successes of the Landsat mission include:

  • Over 7 million scenes of the Earth’s surface have been acquired.
  • Over 22 million scenes have been downloaded through the USGS-EROS website since 2008, when the data was made free-to-access, with the rate continuing to increase (Campbell 2015).
  • The economic value of just one year of Landsat data far exceeds the multi-year total cost of building, launching, and managing the Landsat satellites and sensors.
  • Landsat 5 officially set a new Guinness World Records title for the ‘Longest-operating Earth observation satellite’ with its 28 years and 10 months of operation when it was decommissioned in December 2012.
  • ESA provides Landsat data downlinked via their own data receiving stations; the ESA dataset includes data collected over the open ocean, whereas USGS does not, and the data is processed using ESA’s own processor.

The journey hasn’t always been smooth. Although established by NASA, Landsat was transferred to the private sector under the management of NOAA in the early 1980s, before returning to US Government control in 1992. There have also been technical issues: the failure of Landsat 6 described above, and a Scan Line Corrector failure on Landsat 7 on 31st May 2003, which means that instead of mapping in straight lines, a zigzag ground track is followed. This causes parts of the edge of the image not to be mapped, giving a black stripe effect within these images, although the centre of each image is unaffected and the data overall can still be used.

Landsat was certainly a game changer in the remote sensing and Earth observation industries, both in terms of the data continuity approach and the decision to make the data free to access. It has provided an unrivalled archive of the changing planet which has been invaluable to scientists, researchers, book-writers and businesses like Pixalytics.

We salute Landsat and wish it many more years!

If no-one is there when an iceberg is born, does anyone see it?

Larsen C Ice Shelf including the A68 iceberg. Image acquired by the MODIS instrument on the Aqua satellite on 12th July 2017. Image courtesy of NASA.

The titular paraphrasing of the famous falling tree in the forest riddle was well and truly answered this week, and shows just how far satellite remote sensing has come in recent years.

Last week, sometime between Monday 10th July and Wednesday 12th July 2017, a huge iceberg was created when it split off the Larsen C Ice Shelf in Antarctica. It is one of the biggest icebergs ever recorded according to scientists from Project MIDAS, a UK-based Antarctic research project, who estimate its area to be 5,800 sq km and its weight to be more than a trillion tonnes. It has reduced the area of the Larsen C Ice Shelf by more than twelve percent.

The iceberg has been named A68, which is a pretty boring name for such a huge iceberg. However, icebergs are named by the US National Ice Center, and the letter comes from where the iceberg was originally sighted – in this case the A represents the area from zero to ninety degrees west, covering the Bellingshausen and Weddell Seas. The number is simply the order in which they are discovered, which I assume means there have been 67 previous icebergs!
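
For the curious, the quadrant lettering can be written down as a tiny rule. The ‘A’ definition comes from the paragraph above; the B–D boundaries follow the US National Ice Center convention as we understand it, so treat this as an illustration rather than an official reference:

```python
def iceberg_quadrant(longitude_deg_east):
    """Return the US National Ice Center quadrant letter for a source longitude.

    Longitude is in degrees east (-180 to 180). A covers 0-90W (Bellingshausen/
    Weddell Seas); B covers 90W-180; C covers 180-90E; D covers 90E-0.
    """
    lon = longitude_deg_east
    if -90 <= lon < 0:
        return "A"
    if -180 <= lon < -90:
        return "B"
    if 90 <= lon <= 180:
        return "C"
    if 0 <= lon < 90:
        return "D"
    raise ValueError("longitude must be between -180 and 180")

# Larsen C sits at roughly 62 degrees west, hence the 'A' in A68
print(iceberg_quadrant(-62))  # -> "A"
```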

After satisfying my curiosity about iceberg names, the other element that caught our interest was the host of Earth observation satellites that captured images of either the calving event or the newly born iceberg. The ones we’ve spotted so far, although there may be others, are:

  • ESA’s Sentinel-1 has been monitoring the area for the last year as an iceberg splitting from Larsen C was expected. Sentinel-1’s SAR imagery has been crucial to this monitoring as the winter clouds and polar darkness would have made optical imagery difficult to regularly collect.
  • Whilst Sentinel-1 was monitoring the area, it was actually NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) instrument onboard the Aqua satellite which confirmed the ‘birth’ on the 12th July, with a false colour image at 1 km spatial resolution using band 31, which measures thermal infrared signals. This image is at the top of the blog: the dark blue shows where the surface is warmest and lighter blue indicates a cooler surface. The new iceberg can be seen in the centre of the image.
  • Longwave infrared imagery was also captured by the NOAA/NASA Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite on July 13th.
  • Similarly, NASA also reported that Landsat 8 captured a false-colour image from its Thermal Infrared Sensor on the 12th July showing the relative warmth or coolness of the Larsen C ice shelf – with the area around the new iceberg being the warmest giving an indication of the energy involved in its creation.
  • Finally, Sentinel-3A has also got in on the thermal infrared measurement using the bands of its Sea and Land Surface Temperature Radiometer (SLSTR).
  • ESA’s CryoSat has been used to calculate the size of the iceberg using its Synthetic Aperture Interferometric Radar Altimeter (SIRAL), which measured the height of the iceberg above the water. Using this data, it has been estimated that the iceberg contains around 1,155 cubic km of ice.
  • The only optical imagery we’ve seen so far is from the Deimos-1 satellite, which is owned by Deimos Imaging, an UrtheCast company. This is from the 14th July and revealed that the giant iceberg was already breaking up into smaller pieces.

It’s clear this is a huge iceberg – so huge, in fact, that most news agencies don’t think readers can comprehend its vastness and so offer a comparison. Some of the ones I came across were:

  • Size of the US State of Delaware
  • Twice the size of Luxembourg
  • Four times the size of greater London
  • Quarter of the size of Wales – UK people will know that Wales is almost an unofficial unit of size measurement in this country!
  • Has the volume of Lake Michigan
  • Has twice the volume of Lake Erie
  • Has the volume of 463 million Olympic-sized swimming pools; and
  • My favourite compares its size to the A68 road in the UK, which runs from Darlington to Edinburgh.

This event shows how satellites are monitoring the planet, and the different ways we can see the world changing.

Locusts & Monkeys

Soil moisture data from the SMOS satellite and the MODIS instrument acquired between July and October 2016 were used by isardSAT and CIRAD to create this map showing areas with favourable locust swarming conditions (in red) during the November 2016 outbreak. Data courtesy of ESA. Copyright : CIRAD, SMELLS consortium.

Spatial resolution is a key characteristic in remote sensing, as we’ve previously discussed. Often the view is that you need an object to be significantly larger than the resolution to be able to see it on an image. However, this is not always the case as often satellites can identify indicators of objects that are much smaller.

We’ve previously written about satellites identifying phytoplankton in algal blooms, and recently two interesting reports have described how satellites are being used to determine the presence of locusts and monkeys!

Locusts

Desert locusts are a type of grasshopper; whilst individually they are harmless, as a swarm they can cause huge damage to the populations in their path. Between 2003 and 2005 a swarm in West Africa affected eight million people, with reported losses of 100% for cereals, 90% for legumes and 85% for pasture.

Swarms occur when certain conditions are present, namely a drought followed by rain and vegetation growth. ESA and the UN Food and Agriculture Organization (FAO) have been working together to determine whether data from the Soil Moisture and Ocean Salinity (SMOS) satellite can be used to forecast these conditions. SMOS carries the Microwave Imaging Radiometer with Aperture Synthesis (MIRAS) instrument – a 2D interferometric L-band radiometer with 69 antenna receivers distributed on a Y-shaped deployable antenna array. It observes the ‘brightness temperature’ of the Earth, which indicates the radiation emitted from the planet’s surface. It has a temporal resolution of three days and a spatial resolution of around 50 km.

By combining the SMOS soil moisture observations with data from NASA’s MODIS instrument, the team were able to downscale SMOS to a 1 km spatial resolution and then use this data to create maps. This approach predicted favourable locust swarming conditions approximately 70 days ahead of the November 2016 outbreak in Mauritania, giving the potential for an early warning system.
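
The SMELLS consortium’s downscaling method is more sophisticated than we can do justice to here, but the general idea can be sketched as redistributing each coarse soil-moisture value across its fine-resolution pixels according to a fine-scale proxy (for example a MODIS-derived vegetation or temperature field), while preserving the coarse-cell average. The grids, scale factor and proxy below are entirely hypothetical:

```python
import numpy as np

def downscale(coarse_sm, fine_proxy, factor):
    """Very simplified disaggregation: spread each coarse soil-moisture value
    across its fine-resolution pixels in proportion to a proxy field, while
    preserving the coarse-cell mean. An illustration of the concept only."""
    ny, nx = coarse_sm.shape
    fine = np.zeros((ny * factor, nx * factor))
    for i in range(ny):
        for j in range(nx):
            block = fine_proxy[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            weights = block / block.mean()   # relative spatial pattern within the cell
            fine[i*factor:(i+1)*factor, j*factor:(j+1)*factor] = coarse_sm[i, j] * weights
    return fine

# Hypothetical 2x2 SMOS-like grid and a 10x finer proxy field
coarse = np.array([[0.10, 0.25], [0.05, 0.30]])
proxy = np.random.uniform(0.5, 1.5, size=(20, 20))
print(downscale(coarse, proxy, 10).shape)  # (20, 20)
```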

This is interesting for us as we’re currently using soil moisture data in a project to provide an early warning system for droughts and floods.

Monkeys

Earlier this month the paper, ‘Connecting Earth Observation to High-Throughput Biodiversity Data’, was published in the journal Nature Ecology and Evolution. It describes the work of scientists from the Universities of Leicester and East Anglia who have used satellite data to help identify monkey populations that have declined through hunting.

The team have used a variety of technologies and techniques to pull together indicators of monkey distribution, including:

  • Earth observation data to map roads and human settlements.
  • Automated recordings of animal sounds to determine what species are in the area.
  • Captured mosquitoes, analysed to determine what they have been feeding on.

Combining these various datasets provides a huge amount of information, and can be used to identify areas where monkey populations are vulnerable.

These projects demonstrate an interesting capability of satellites, which is not always recognised and understood. By using satellites to monitor certain aspects of the planet, the data can be used to infer things happening on a much smaller scale than individual pixels.

Blue Holes from Space

Andros Island in The Bahamas. Acquired by Landsat 8 in February 2017. Data courtesy of NASA.

Blue holes are deep marine caverns or sinkholes which are open at the surface; they get their name from the deep blue colour of their surface, caused by the scattering of light within the water. They often contain both seawater and freshwater, and in their depths the water is very clear, which makes them very popular with divers.

The term ‘blue hole’ first appeared on sea charts from the Bahamas in 1843, although the concept of submarine caves had been described a century earlier (from Schwabe and Carew, 2006). There are a number of well-known blue holes in Belize, Egypt and Malta amongst others. The Dragon Hole in the South China Sea is believed to be the deepest blue hole with a depth of 300 metres.

Andros Island in The Bahamas has the highest concentration of blue holes in the world, and last week we watched a television programme called River Monsters featuring this area. The presenter, Jeremy Wade, was investigating the mythical Lusca, a Caribbean sea creature which reportedly attacks swimmers and divers, pulling them down to its lair deep within the blue holes. Jeremy fished and dived some blue holes, and spoke to people who had seen the creature. He concluded that the myth of the Lusca was most likely based on a giant octopus. Whilst this was interesting, by the end of the programme we were far more interested in whether you could see blue holes from space.

The image at the top is Andros Island. Although technically it’s an archipelago, it is considered a single island. It’s the largest island of The Bahamas and, at 2,300 square miles, the fifth largest in the Caribbean. There are a number of well-known blue holes in Andros, both inland and off the coast, such as:

Blue holes in the Blue Holes National Park on Andros Island in The Bahamas. Acquired by Landsat 8 in February 2017. Data courtesy of NASA.

  • Blue Holes National Park covers over 33,000 acres and includes a variety of blue holes, freshwater reservoirs and forests within its boundaries. The image to the right covers an area of the national park. In the centre, just above the green water, there are five black circles – despite the colour, these are blue holes.
  • Uncle Charlie’s Blue Hole, also called Little Frenchman Blue Hole, is just off Queen’s Highway in Nicholls Town and has a maximum depth of 127 metres.
  • Atlantis Blue Hole has a maximum depth of about 85 metres.
  • Stargate Blue Hole is located about 500 metres inland from the east coast of South Andros, on the west side of The Bluff village.
  • Guardian Blue Hole is in the ocean and is believed to have the second deepest cave in The Bahamas, with a maximum explored depth of 133 metres.

Blue hole in the south of Andros Island in The Bahamas. Acquired by Landsat 8 in February 2017. Data courtesy of NASA.

The image to the right is from the south of the island. Just off the centre, you can see a blue hole surrounded by forests and vegetation.

So we can confirm that the amazing natural features called blue holes can be seen from space, even if they don’t always appear blue!

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS image from 25 April 2017 of the Yucatán Peninsula, showing where thermal bands have picked up increased temperatures. Data courtesy of NASA; NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways satellites can monitor fires.

Acquired on the 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument’s thermal band detected temperatures higher than normal.

False colour image of the West Mims fire on Florida/Georgia boundary acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border imaged by NASA’s Aqua satellite on the 02 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). On the natural colour image the fires could only be seen as smoke plumes, but the false colour image on the left combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself is shown as orange.
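
The red dots in the VIIRS image come from thermal-anomaly detection. NASA’s operational fire products use considerably more elaborate contextual tests, but a toy version of the idea – flag pixels whose brightness temperature stands well above an absolute threshold or the local background – might look like this (the thresholds, window size and scene are placeholders):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def detect_hotspots(bt_kelvin, window=21, abs_threshold=330.0, n_sigma=4.0):
    """Flag pixels that are hotter than an absolute threshold, or that exceed
    the local background mean by n_sigma standard deviations. A much-simplified
    stand-in for operational contextual fire tests."""
    mean = uniform_filter(bt_kelvin, size=window)
    mean_sq = uniform_filter(bt_kelvin**2, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))
    return (bt_kelvin > abs_threshold) | (bt_kelvin > mean + n_sigma * std)

# Synthetic 100x100 brightness-temperature scene (~300 K) with two hot pixels
scene = np.full((100, 100), 300.0) + np.random.normal(0, 1.0, (100, 100))
scene[40, 60] = 345.0
scene[70, 20] = 360.0
print(np.argwhere(detect_hotspots(scene)))  # rows/cols of flagged pixels
```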

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; it appears that the MODIS fire products contribute part of the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for undertaking this practice.

Burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo, imaged by Sentinel-2.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest that was burnt last year in the Republic of Congo in Africa – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, and the revisit time will improve to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36,000 hectares of forest were burnt in 2016.
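
ESA’s Congo assessment blends the radar and optical observations in ways we can’t reproduce here, but on the optical side a widely used indicator of burning is the differenced Normalized Burn Ratio (dNBR), computed from near-infrared and shortwave-infrared reflectance before and after the fire season. A minimal sketch follows; the choice of Sentinel-2 bands B8A and B12 and the 0.27 threshold are common conventions rather than details taken from the ESA study:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio, e.g. from Sentinel-2 B8A (NIR) and B12 (SWIR)."""
    nir, swir = nir.astype("float64"), swir.astype("float64")
    return (nir - swir) / np.maximum(nir + swir, 1e-6)

def burnt_area_ha(nir_pre, swir_pre, nir_post, swir_post,
                  pixel_size_m=10.0, dnbr_threshold=0.27):
    """Estimate burnt area in hectares from pre- and post-fire reflectance.
    The 0.27 dNBR threshold is a commonly quoted value, not ESA's setting."""
    dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
    burnt_pixels = np.count_nonzero(dnbr > dnbr_threshold)
    return burnt_pixels * (pixel_size_m ** 2) / 10000.0

# With 10 m pixels, each burnt pixel contributes 0.01 ha to the total.
```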

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses which should potentially help save lives. This is another example of the societal benefit of satellite remote sensing.

China’s Geo-Information Survey

Yuqiao Reservoir, east of Beijing, China, from Landsat 8, acquired March 2017. Data courtesy of NASA/USGS.

The first national geo-information study of China was released last week at a State Council Information Office press briefing.

The study, also referred to as the national census of geographic conditions, was originally announced in March 2013. Over the last three years 50,000 professionals have been involved in collecting a variety of data about China, and it’s reported that they have achieved 92% coverage of the country, generating around 770 terabytes of data in the process.

Data has been collected on natural resources, such as land features, vegetation, water and deserts; together with urban resources such as transport infrastructure, towns and neighbourhoods. This information was gathered, and verified, through remote sensing satellites, drones, aerial photography, 3D laser scanning and in-situ data. It’s reported that the accuracy is 99.7% with a 1 m resolution.

China is one of the largest countries in the world by land mass, at approximately 9.6 million square kilometres. Therefore, simply completing such a study with the accuracy and resolution reported is highly impressive.

It may take years to fully appreciate the variety, size and usefulness of this new dataset. However, a number of interesting high level statistics have already been released by the Chinese Ministry of Land and Resources including:

  • 23.2% of China’s land is above 3,500 m altitude, and 43.4% is below 1,000 m altitude.
  • 7.57 million sq km of the country has vegetation cover, with 21.1% being cultivated lands and the remainder grasslands and forests.
  • 1.3 million sq km of land is desert and bare lands, whilst rivers cover 6.55 million sq km.
  • 153,000 sq km of land has buildings on it.
  • 116,500 sq km of land is covered by railway track, and 2 million sq km by roads.

According to Kurex Mexsut, deputy head of the National Administration of Surveying, Mapping and Geoinformation, the Chinese Government will be looking to establish a data sharing mechanism and information services platform for this dataset, together with a variety of data products. It is hoped that public departments and companies will be able to use this to help improve the delivery of public services.

Although not from the survey, the image at the top is of the Yuqiao Reservoir, situated just to the east of Beijing. It has a surface area of 119 sq km, with an average depth of 14 metres.

Not only is this a comprehensive geo-information dataset for a single country, but there is also huge potential for further information to be derived from this dataset. We’ll be watching with interest to see how the data is used and the impact it has.

Three Exciting Ways to Protect Forests With Remote Sensing

Forests cover one third of the Earth’s land mass and are home to more than 80% of the terrestrial species of animals, plants and insects. However, 13 million hectares of forest are destroyed each year. The United Nations International Day of Forests took place recently, on 21st March, to raise awareness of this vital resource.

Three remote sensing applications to help protect forests caught our eye recently:

Two scans show the difference between infected, on the right, and uninfected, on the left, patches of forest. Image courtesy of the University of Leicester.

Identifying Diseased Trees
In the March issue of Remote Sensing, researchers from the University of Leicester (Barnes et al., 2017) published a paper entitled ‘Individual Tree Crown Delineation from Airborne Laser Scanning for Diseased Larch Forest Stands’. It describes how the researchers were able to identify individual trees affected by larch tree disease, also known as Phytophthora ramorum.

This fungus-like disease can cause extensive damage, including the death of the tree, and diseased trees can be identified by defoliation and dieback. Airborne LiDAR surveys were undertaken by the company Bluesky at an average altitude of 1,500 m, with a scan frequency of 66 Hz, giving a sensor range precision within 8 mm and an elevation accuracy of around 3–10 cm.

Remote sensing has been used to monitor forests for many years, but using it to identify individual trees is uncommon. The researchers in this project were able to successfully identify larch canopies partially or wholly defoliated by the disease in greater than 70% of cases. Whilst further development of the methodology will be needed, it is hoped that this will offer forest owners a better way of identifying diseased trees and enable them to respond more effectively to such outbreaks.
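
To give a flavour of how individual trees can be picked out of airborne laser scanning data – this is a generic illustration, not the method of Barnes et al. (2017) – one simple approach is to rasterise the point cloud into a canopy height model and treat local maxima above a minimum height as candidate tree tops. The window size, height threshold and synthetic canopy below are made up:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def find_tree_tops(chm, window=5, min_height=2.0):
    """Return (row, col) indices of local maxima in a canopy height model (CHM).
    A pixel is a candidate tree top if it equals the maximum within a
    window x window neighbourhood and exceeds min_height metres."""
    local_max = maximum_filter(chm, size=window)
    tops = (chm == local_max) & (chm > min_height)
    return np.argwhere(tops)

# Synthetic 1 m CHM with two "trees" represented as smooth bumps
y, x = np.mgrid[0:60, 0:60]
chm = (12 * np.exp(-((x - 20)**2 + (y - 30)**2) / 30.0)
       + 9 * np.exp(-((x - 45)**2 + (y - 15)**2) / 20.0))
print(find_tree_tops(chm))  # -> the two peak locations
```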

Monitoring Trees From Space
An interesting counterpoint to the work of Barnes et al. (2017) was published in the journal Forestry last month: the paper ‘Estimating stand density, biomass and tree species from very high resolution stereo-imagery – towards an all-in-one sensor for forestry applications’, written by Fassnacht et al. (2017).

It describes work undertaken to compare the results of very high resolution optical satellite data with those of airborne LiDAR and hyperspectral data, to provide support for forestry management. The team used WorldView-2 images of a temperate mixed forest in Germany, with a 2 m pixel size, alongside a LiDAR DTM with a 1 m pixel size. This data was then used to estimate tree species, forest stand density and biomass.

They found good results for both forest stand density and biomass compared to other methods. Although the tree species classification achieved over eighty percent accuracy, this was less than that achieved by hyperspectral data over the same site; differentiation of broadleaved and coniferous trees, however, was almost perfect.

This work shows that whilst further work is needed, optical data has the potential to offer a number of benefits for forestry management.

Monitoring Illegal Logging
Through the International Partnership Programme the UK Space Agency is funding a consortium, led by Stevenson Astrosat Ltd, who will be using Earth Observation (EO) data to monitor, and reduce, illegal logging in Guatemala.

The issue has significant environmental and socioeconomic impacts on the country through deforestation and change of land use. The Guatemalan government have made significant efforts to combat the problem; however, the area to be monitored is vast. This project will provide a centralised system using EO satellite data and Global Navigation Satellite Systems (GNSS) technology, accessed via mobile phones or tablets, to enable Guatemala’s National Institute of Forestry (INAB) to better track land management and identify cases of illegal logging.

Overall
The protection of our forests is critical to the future of the planet, and it’s clear that satellite remote sensing can play a much greater role in that work.

Catching Wavelength 2017

Remote sensing, like GIS, excels at integrating across disciplines and people. Whilst no one ever said being a multi-disciplinary scientist was going to be easy, for the ‘thirsty’ mind it offers challenges, cross-pollinates ideas and looks at problems with new eyes. A diverse group of people connected by a common thread of spatial and remotely sensed data found themselves doing all these things and more in London last week at the Wavelength 2017 conference.

The talks and posters took us on a whirlwind tour through the ever-varying landscape of remote sensing. We moved through subject areas ranging from detecting ground ice, vegetation and overall land cover, through to earth surface movement and 3D imaging, and on to agricultural yield and drought. We also covered the different vertical scales from which remotely sensed data is collected, whether from satellites, planes, drones or cameras operated from ground level. On top of this we had some great keynote talks, running through the varied career of a remote sensing scientist (Groeger Ltd), in-depth data assimilation of remote sensing imagery into models (UCL), and commercial developments in airborne camera work (Geoxphere Ltd).

In parallel, we were taken on a grand tour covering the temperate UK, parts of the Middle East, the tundra in North America, the central belt of Africa, and even on to the Moon and Mars! In many cases we heard talks from scientists from these countries (though not the Moon or Mars…). Some are based at universities in the UK, whilst others came specifically to talk at the conference.

I found myself transfixed by the far-flung places: listening to how the permanently dark regions of the Moon are being mapped – places that never see daylight, are incredibly ‘chilly’ and trap ice in their shadowed lands. I also heard about the CO2 that precipitates out of the atmosphere on Mars as snow and forms a 1 m blanket. Working in places like Africa started to feel really quite local and accessible!

Possibly the most intriguing aspect of the conference for me was the advances that have been made in photogrammetry, and how multiple photos are now being used to produce highly intricate 3D models. We saw this applied to cliff morphology and change detection, as well as the 3D point clouds that are produced when modelling trees and vegetation generally.

The 3D models aren’t totally complete due to line-of-sight and other issues. The model visualisations look like an impressionist painting to me, with tree leaves without trunks, or clumps of green mass suspended in mid-air. However, this does not matter when calculating leaf volume and biomass, as these discrepancies can be worked around and still lead to some very useful estimates of seasonality and change.

Setting this up was no small feat for the organiser, PhD student James O’Connor. He delivered an interesting programme and looked after the delegates well. I can truly say I haven’t been to such a friendly conference before. It was also unique in providing ample time to discuss the material presented, both from talks and posters, and to share technical know-how. This felt of real value, especially to the PhD students and young professionals this conference is geared towards, but equally to myself, with experience in only certain fields of remote sensing.

I would highly recommend Wavelength, and look forward to seeing what they are planning for 2018!

Blog written by Caroline Chambers, Pixalytics Ltd.