Sentinel-3B Sets Forth

Copernicus Sentinel-3B satellite in its rocket ready to go to the launch pad.
Image courtesy of ESA–S. Corvaja.

The latest Sentinel satellite in the Copernicus programme, Sentinel-3B, launched on the 25th April from the Plesetsk Cosmodrome in Russia. Lift-off was at 18.57 BST, and you could have watched the event live on the ESA broadcast.

Sentinel-3B is the twin of Sentinel-3A, which was launched on the 16th February 2016. It has a launch weight of approximately 1 250 kg and, following a flight of just under one and a half hours, it should go into a near-polar sun-synchronous orbit at an 814 km altitude.

The twin satellites are identical and carry four scientific instruments:

  • Sea and Land Surface Temperature Radiometer (SLSTR) will measure temperatures of both the sea and land, to an accuracy of better than 0.3 K. This instrument has 9 spectral bands with a spatial resolution of 500 m for visible/near-infrared wavelengths and 1 km for the thermal wavelengths; and has swath widths of 1420 km at nadir and 750 km looking backwards. It’s worth noting that two thermal infrared spectral wavebands are optimised for fire detection, providing the fire radiative power measurement.
  • Ocean and Land Colour Instrument (OLCI) has 21 spectral bands (400–1020 nm) focussed on ocean colour and vegetation measurements. All bands have a spatial resolution of 300 m with a swath width of 1270 km.
  • Synthetic Aperture Radar Altimeter (SRAL), which has dual-frequency Ku and C bands. It offers 300 m spatial resolution after SAR processing, and is based on the instruments from the CryoSat and Jason missions.
  • Microwave Radiometer (MWR), a dual-frequency radiometer operating at 23.8 and 36.5 GHz, which is used to derive atmospheric column water vapour measurements for correcting the SRAL instrument.

Once in orbit, the two satellites will be separated by 140 degrees, which will allow them to offer short revisit times – less than two days for the OLCI and less than a day at the equator for the SLSTR. The operational life of each satellite is seven years.

Italy and the Mediterranean captured by Sentinel-3A on the 28 September 2016.
Image courtesy of ESA; contains modified Copernicus Sentinel data (2016), processed by ESA, CC BY-SA 3.0 IGO.

Sentinel-3 is generally considered to be an ocean and coastal monitoring mission and its measurements include sea-surface height, sea surface temperature, ocean colour, surface wind speed, sea ice thickness and ice sheets. In the image to the left, it is interesting to note the sediment in the water on the east coast of Italy, in contrast to the mostly sediment-free west coast.

As you can see from this image, in addition to its primary focus on water, Sentinel-3 also provides measurements over land, including the heights of rivers and lakes, water quality indicators, land cover change, vegetation indices and the monitoring of wildfires.

This is the seventh satellite in the Copernicus programme launched since 2014, and it completes the trio of twin satellites following the radar imaging Sentinel-1A & 1B and the optical imaging Sentinel-2A & 2B. The other satellite is the singular Sentinel-5P, which measures the atmosphere, although a number of further Sentinel missions are already planned. All the data from these satellites is free to access for anyone with a computer and a decent internet connection. You can download the data yourself, although there are an increasing number of websites that will do a lot of the basic processing and visualising for you, meaning all you need to do is pick what you want to investigate. This is great for people new to satellite data, as it enables them to get involved with Copernicus data without the need for any specialist skills.
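If you want to try downloading Copernicus data programmatically, a minimal sketch using the open-source sentinelsat Python package might look like the following; the credentials, GeoJSON file and dates are placeholders, and this is just one of several access routes.

```python
# A minimal sketch of querying and downloading Copernicus data with the
# open-source sentinelsat package; credentials and area are placeholders.
from sentinelsat import SentinelAPI, geojson_to_wkt, read_geojson

# Connect to the Copernicus Open Access Hub (free registration required)
api = SentinelAPI('your_username', 'your_password',
                  'https://scihub.copernicus.eu/dhus')

# Search for Sentinel-3 products over an area of interest defined in GeoJSON
footprint = geojson_to_wkt(read_geojson('area_of_interest.geojson'))
products = api.query(footprint,
                     date=('20180425', '20180525'),
                     platformname='Sentinel-3')

print(f'Found {len(products)} products')
api.download_all(products)  # fetch everything the query returned
```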

This programme also offers companies, like Pixalytics, the opportunity to develop a range of products and services based on the data. We already have products using Sentinel-1, and are in the process of developing ones with Sentinel-2 and Sentinel-3. Exciting times in Earth observation!

We wish Sentinel-3B well as it sets forth on its journey!

Monitoring Water Quality from Space

Algal Blooms in Lake Erie, around Monroe, acquired by Sentinel-2 on 3rd August 2017. Data Courtesy of ESA/Copernicus.

Two projects using Earth Observation (EO) data to monitor water quality caught our eye recently. As we're in the process of developing two water quality products for our own online portal, we're interested in what everyone else is doing!

At the end of January UNESCO’s International Hydrological Programme launched a tool to monitor global water quality. The International Initiative on Water Quality (IIWQ) World Water Quality Portal, built by EOMAP, provides:

  • turbidity and sedimentation distribution
  • chlorophyll-a concentration
  • harmful algal bloom indicator
  • organic absorption
  • surface temperature

Based on optical data from Landsat and Sentinel-2, it can provide global surface water mosaics at 90 m spatial resolution, alongside 30 m resolution for seven pilot river basins. The portal was launched in Paris at the “Water Quality Monitoring using Earth Observation and Satellite-based Information” meeting and was accompanied by an exhibition on “Water Quality from Space – Mesmerizing Images of Earth Observation”.

The tool, which can be found here, focuses on providing colour visualizations of the data alongside data legends to help make it as easy as possible to use. It is hoped that this will help inform and educate policy makers, water professionals and the wider public about the value of using satellite data for monitoring water resources.

A second interesting project, albeit on a smaller scale, was announced last week and is going to use Sentinel-2 imagery to monitor water quality in Scottish lochs. Dr Claire Neil, from the University of Stirling, is leading the project and will be working with the Scottish Environment Protection Agency (SEPA). It will use reflectance measurements to estimate chlorophyll-a concentrations, helping to identify algal blooms and other contaminants in the waters. The project will offer an alternative to the current approach to water quality monitoring, which relies on sampling close to the water's edge.
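The project's exact algorithm isn't described, but chlorophyll-a is commonly estimated from blue/green reflectance band ratios. A minimal sketch of that general approach is below; the polynomial coefficients are illustrative placeholders, not the values any operational product uses.

```python
import numpy as np

def chlorophyll_band_ratio(r_blue, r_green, coeffs=(0.3, -2.7, 1.5, -0.5)):
    """Illustrative band-ratio estimate of chlorophyll-a (mg/m^3).

    r_blue, r_green: water reflectance in a blue and a green band.
    coeffs: polynomial coefficients -- placeholders, not operational values.
    """
    x = np.log10(np.asarray(r_blue, dtype=float) /
                 np.asarray(r_green, dtype=float))
    log_chl = sum(c * x**i for i, c in enumerate(coeffs))
    return 10.0 ** log_chl

# Clearer water reflects relatively more blue, giving lower chlorophyll
print(chlorophyll_band_ratio(0.008, 0.005))
```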

An interesting feature of the project, particularly for us, is the intention to focus on developing this work into an operational capability for SEPA to enable them to improve their approach to assessing water quality.

This transition from a ‘good idea’ into an operational product that will be used, and therefore purchased, by end users is what all EO companies are looking for, and we're no different. Our Pixalytics Portal, which we discussed a couple of weeks ago, is one of the ways we are trying to move in that direction. We have two water quality monitoring products on it:

  • Open Ocean Water Quality product extracts time-series data from a variety of 4 km resolution satellite datasets from NASA, giving an overview of what is happening in the water without the need to download a lot of data (a minimal sketch of this kind of extraction follows this list).
  • Planning for Coastal Airborne Lidar Surveys product provides an assessment of the penetration depth of a Lidar laser beam, from an airborne survey system, within coastal waters based on the turbidity of the water. This ensures that companies who plan overflights can have confidence in how far their Lidar will see.
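As an illustration of the kind of extraction the Open Ocean product performs, here is a minimal sketch that pulls a single-point time series out of a stack of NASA ocean colour NetCDF files using xarray; the file pattern, variable name and coordinates are all assumptions for the example.

```python
import xarray as xr

# Open a stack of (hypothetical) 4 km ocean colour NetCDF files as a single
# dataset; the file pattern and variable name below are assumptions.
ds = xr.open_mfdataset('ocean_colour_4km_*.nc', combine='by_coords')

# Extract the chlorophyll time series at one location (nearest 4 km cell)
point = ds['chlor_a'].sel(lat=50.3, lon=-4.1, method='nearest')

# Summarise the series rather than downloading and handling the full grids
print(point.to_series().describe())
```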

We’re just at the starting point in productizing the services we offer, and so it is always good to see how others are approaching the similar problem!

To TEDx Speaking and Beyond!

Back in April I received an invitation to speak at the ‘One Step Beyond’ TEDx event organised at the National Space Centre in Leicester, with my focus on the Blue Economy and Earth Observation (EO).

We’ve been to a few TEDx events in the past and they’ve always been great, and so I was excited to have the opportunity to join this community. Normally, I’m pretty relaxed about public speaking. I spend a lot of time thinking about what I’m going to say, but don’t assemble my slides until a couple of days beforehand. This approach has developed in part because I used to lecture – where I got used to talking for a while with a few slides – but also because I always like to take some inspiration from the overall mood of the event I’m talking at. This can be through hearing other speakers, attending workshops or even just walking around the local area.

TEDx, however, was different. There was a need to have the talk ready early for previewing and feedback, alongside producing stunning visuals and having a key single message. So, for a change, I started with a storyboard.

My key idea was to get across the sense of wonder I and many other scientists share in observing the oceans from space, whilst also emphasising that anyone can get involved in protecting this natural resource. I echoed the event title by calling my talk “Beyond the blue ocean” as many people think of the ocean as just a blue waterbody. However, especially from space, we can see the beauty, and complexity, of colour variations influenced by the microscopic life and substances dissolved and suspended within it.

I began with an image called the ‘Pale Blue Dot’, which was taken by Voyager 1 at a distance of more than 4 billion miles from Earth, and then went on to the well-known ‘Blue Marble’ image before zooming into what we see from more conventional EO satellites. I also wanted to take the audience beyond just optical wavelengths, and so displayed microwave imagery from Sentinel-1 that's at a similar spatial resolution to my processed 15 m resolution Sentinel-2 data, which was also shown.

Dr Samantha Lavender speaking at the One Step Beyond TEDx event in Leicester. Photo courtesy of TEDxLeicester

The satellite imagery included features such as wind farms, boats and phytoplankton blooms I intended to discuss. However, this didn't quite go to plan on my practice run-through! The talk was in the planetarium at the National Space Centre, which meant the screen was absolutely huge – as you can see in the image to the right. However, with the lights on in the room the detail in the images was really difficult to see. The solution for the talk itself was to have the planetarium in darkness and myself picked out by two large spotlights, meaning that the image details were visible to the audience but I couldn't see the audience myself.

The evening itself took place on the 21st September, and with almost two hundred in the audience I was up first. I was very happy with how it went and the people who spoke to me afterwards said they were inspired by what they’d seen. You can see for yourself, as the talk can be found here on the TEDx library. Let me know what you think!

I was followed by two other fantastic speakers who gave inspiring presentations, and these are also up on the TEDx library. Firstly, Dr Emily Shuckburgh, Deputy Head of the Polar Oceans team at the British Antarctic Survey, discussed “How to conduct a planetary health check”; and she was followed by Corentin Guillo, CEO and Founder of Bird.i, who spoke about “Space entrepreneurship, when thinking outside the box is not enough”.

The whole event was hugely enjoyable and the team at TEDx Leicester did an amazing job of organising it. It was good to talk to people after the event, and it was fantastic that seventy percent of the audience were aged between 16 and 18. We need to do much more of this type of outreach activity to educate and inspire the next generation of scientists. Of course, for me, the day also means that I can now add TEDx Speaker to my biography!

Can You See The Great Wall of China From Space?

Area north of Beijing, China, showing the Great Wall of China running through the centre. Image acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

Dating back over two thousand three hundred years, the Great Wall of China winds its way from east to west across the northern part of the country. The current remains were built during the Ming Dynasty and have a length of 8 851.8 km, according to 2009 work by the Chinese State Administration of Cultural Heritage and the National Bureau of Surveying and Mapping. However, if you take into account the different parts of the wall built by other dynasties, its length is almost twenty-two thousand kilometres.

The average height of the wall is between six and seven metres, and its width is between four and five metres. This width would allow five horses, or ten men, to walk side by side. The sheer size of the structure has led people to believe that it could be seen from space. This was first suggested by William Stukeley in 1754, when he wrote in reference to Hadrian's Wall that ‘This mighty wall of four score miles in length is only exceeded by the Chinese Wall, which makes a considerable figure upon the terrestrial globe, and may be discerned at the Moon.’

Despite Stukeley’s personal opinion not having any scientific basis, it has been repeated many times since. By the time humans began to go into space, it was considered a fact. Unfortunately, astronauts such as Buzz Aldrin, Chris Hatfield and even China’s first astronaut, Yang Liwei, have all confirmed that the Great Wall is not visible from space by the naked eye. Even Pixalytics has got a little involved in this debate. Two years ago we wrote a blog saying that we couldn’t see the wall on Landsat imagery as the spatial resolution was not small enough to be able to distinguish it from its surroundings.

Anyone who is familiar with the QI television series on the BBC will know that they occasionally ask the same question in different shows and give different answers when new information comes to light. This time it’s our turn!

Last week Sam was a speaker at the TEDx One Step Beyond event at the National Space Centre in Leicester – you’ll hear more of that in a week or two. However, in exploring some imagery for the event we looked for the Great Wall of China within Sentinel-2 imagery. And guess what? We found it! In the image at the top, the Great Wall can be seen cutting down the centre from the top left.

Screenshot of SNAP showing area north of Beijing, China. Data acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

It was difficult to spot. The first challenge was getting a cloud-free image of northern China, and we only found one covering our area of interest north of Beijing! And despite Sentinel-2 having 10 m spatial resolution for its visible wavelengths, the wall, as noted above, is generally narrower than that. This means it is difficult to see the actual wall itself, but it is possible to see its path on the image. This ability to see very small things from space through their influence on their surroundings is similar to how we are able to spot microscopic phytoplankton blooms. The image on the right is a screenshot from the Sentinel Application Platform tool (SNAP), which shows the original Sentinel-2 image of China on the top left and the zoomed section identifying the wall.
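The reason a feature narrower than a pixel can still leave a trace is linear mixing: each pixel records an area-weighted average of everything inside it, so a wall that is brighter (or darker) than its surroundings shifts the values of the pixels it crosses. A toy illustration, with made-up reflectance values:

```python
pixel_size = 10.0   # metres, Sentinel-2 visible bands
wall_width = 5.0    # metres, roughly the wall's width
background = 0.10   # made-up reflectance of the surrounding terrain
wall = 0.30         # made-up reflectance of the brighter stone wall

# Fraction of each crossed pixel covered by the wall
fraction = wall_width / pixel_size

# Area-weighted (linear) mixture recorded by the sensor
mixed = fraction * wall + (1 - fraction) * background
print(f'pixel along the wall: {mixed:.2f}, background: {background:.2f}')
# -> 0.20 vs 0.10: no pixel contains only wall, yet its path stands out
```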

So whilst the Great Wall of China might not be visible from space with the naked eye, it is visible from our artificial eyes in the skies, like Sentinel-2.

Supporting Soil Fertility From Space

Sentinel-2 pseudo-true colour composite from 2016 with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. Sentinel data courtesy of ESA/Copernicus.

Last Tuesday I was at the academic launch event for the Tru-Nject project at Cranfield University. Despite the event's title, it was in fact an end-of-project meeting. Pixalytics has been involved in the project since July 2015, when we agreed to source and process high resolution satellite Earth Observation (EO) imagery for them.

The Tru-Nject project is funded via Innovate UK. Its official title is ‘Tru-Nject: Proximal soil sensing based variable rate application of subsurface fertiliser injection in vegetable/combinable crops’. The focus is on modelling soil fertility within fields, to enable fertiliser to be applied in varying amounts using point-source injection technology, which reduces the nitrogen loss to the atmosphere when compared with spreading fertiliser on the soil surface.

To do this the project created soil fertility maps from a combination of EO products, physical sampling and proximal soil sensing – where approximately 15 000 georeferenced hyperspectral spectra are collected using an instrument connected to a tractor. These fertility maps are then interpreted by an agronomist, who decides on the relative application of fertiliser.

Initial results have shown that applying increased fertiliser to areas of low fertility improves overall yield when compared to applying an equal amount of fertiliser everywhere, or applying more fertiliser to high yield areas.

Pixalytics' involvement in the work focussed on acquiring and processing historical and new sub-5-metre optical satellite imagery for two fields, near Hull and York. We primarily acquired data from the Kompsat satellites operated by the Korea Aerospace Research Institute (KARI), supplemented with WorldView data from DigitalGlobe. Once we'd acquired the imagery, we processed it to:

  • remove the effects of the atmosphere, termed atmospheric correction, and then
  • convert it to maps of vegetation greenness (NDVI, sketched below).
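As a sketch of that second step, the Normalized Difference Vegetation Index (NDVI) is computed from the atmospherically corrected red and near-infrared bands; a minimal version, assuming the two bands have already been read into numpy arrays, might be:

```python
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index from surface reflectance.

    red, nir: numpy arrays of atmospherically corrected reflectance.
    Returns values in [-1, 1]; dense green vegetation sits towards +1.
    """
    red = red.astype('float64')
    nir = nir.astype('float64')
    with np.errstate(divide='ignore', invalid='ignore'):
        result = (nir - red) / (nir + red)
    # Mark pixels with no valid signal (e.g. zero in both bands) as no-data
    return np.where(np.isfinite(result), result, np.nan)
```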

The new imagery needed to coincide with a particular stage of crop growth, which meant the satellite data acquisition period was narrow. To ensure that we collected data on specific days, we tasked the Kompsat satellites each year. This led to a pleasant surprise for Dave George, Tru-Nject Project Manager, who said, ‘I never believed I'd get to tell a satellite what to do.’

Whilst we were quite successful with the tasking, the combination of this being the UK and the fields being relatively small meant that some of the images were partly affected by cloud. Where this occurred, we gap-filled with Copernicus Sentinel-2 data, which has coarser spatial resolution (15 m) but more regular acquisitions.

In addition, we needed to undertake vicarious adjustment to ensure that we produced consistent products over time, given that the data came from different sensors with different specifications. As we cannot go up to a satellite to measure its calibration, vicarious adjustment uses ground measurements and algorithms to not only cross-calibrate the data, but also adjust for errors in the atmospheric correction.
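In its simplest form, this kind of cross-calibration amounts to fitting a gain and offset between coincident measurements from a reference (for example, ground data) and the sensor being adjusted. A minimal sketch with made-up sample values; real vicarious adjustment is band-by-band and considerably more involved:

```python
import numpy as np

# Coincident reflectance samples: reference (e.g. ground) values against
# the sensor being adjusted. These numbers are illustrative only.
reference = np.array([0.05, 0.12, 0.21, 0.33, 0.41])
sensor = np.array([0.07, 0.15, 0.26, 0.39, 0.49])

# Fit a linear gain/offset mapping the sensor onto the reference scale
gain, offset = np.polyfit(sensor, reference, deg=1)

def adjust(measurement):
    """Apply the fitted adjustment to new measurements from this sensor."""
    return gain * measurement + offset

print(f'gain = {gain:.3f}, offset = {offset:.4f}')
print(adjust(sensor))  # now sits close to the reference values
```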

An example of the work is at the top, which shows a Sentinel-2 pseudo-true colour composite from 2016 with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. The greener the NDVI product, the greener the vegetation, although the two datasets were collected in different years and so the planting within the field varies.

We’ve really enjoyed working with Stockbridge Technology Centre Ltd (STC), Manterra Ltd, and Cranfield University, who were the partners in the project. Up until last week all the work was done via telephone and email, and so it was great to finally meet them in-person, hear about the successful project and discuss ideas for the future.

Great Barrier Reef Coral Bleaching

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

Coral bleaching on the Great Barrier Reef in Australia was worse than expected last year, and a further decline is expected in 2017 according to the Great Barrier Reef Marine Park Authority. In a document issued this week they noted that, along with reefs across the world, the Great Barrier Reef has had widespread coral decline and habitat loss over the last two years.

We’ve written about coral bleaching before, as it’s a real barometer of climate change. To put the importance of the Great Barrier Reef into context:

  • It’s 2300 km long and covers an area of around 70 million football pitches;
  • Consists of 3000 coral reefs, which are made up from 650 different types of hard and soft coral; and
  • Is home to over 1500 types of fish and more than 100 varieties of sharks and rays.

Coral bleaching occurs when water stress causes coral to expel the photosynthetic algae which give coral their colours, exposing the skeleton and turning them white. The stress is mostly due to higher seawater temperatures, although cold-water stress, run-off, pollution and high solar irradiance can also cause bleaching. Whilst bleaching does not kill coral immediately, it does put them at a greater risk of mortality from storms, poor water quality, disease and the crown-of-thorns starfish.

Last year the Great Barrier Reef suffered its worst bleaching on record; aerial and in-water surveys identified that 29% of shallow water coral reefs died in 2016, up from the original estimate of 22%. The most severe mortality was in an area to the north of Port Douglas, where 70% of the shallow water corals died. This is hugely sad news for Sam and me, as we explored this area of the Great Barrier Reef ourselves about fifteen years ago.

Whilst hugely concerning, there is also a little hope! There was a strong recovery of coral in the south of the Great Barrier Reef, where bleaching and other impacts were less severe.

Images from the Copernicus Sentinel-2A satellite captured on 8 June 2016 and 23 February 2017 show coral turning bright white at Adelaide Reef, Central Great Barrier Reef. Data courtesy of Copernicus/ESA; contains modified Copernicus Sentinel data (2016–17), processed by J. Hedley; conceptual model by C. Roelfsema.

The coral bleaching event this year has also been captured by Sentinel-2. Scientists from ESA's Sen2Coral project have used change detection techniques to identify bleaching. Images between January and April showed areas of coral turning bright white and then darkening, although it was unclear whether the darkening was due to coral recovery or to dead coral being overgrown with algae. In-water surveys were undertaken, which confirmed that the majority of the darkened areas were algal overgrowth.
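At its simplest, change detection of this kind differences two co-registered images of the same reef and flags pixels that brighten sharply, as bleached coral turns white. Sen2Coral's actual processing is more sophisticated (the water column has to be corrected for, among other things); the threshold below is arbitrary and for illustration only.

```python
import numpy as np

def bleaching_candidates(before, after, threshold=0.05):
    """Flag pixels that brightened between two co-registered acquisitions.

    before, after: 2-D reflectance arrays of the same reef area.
    threshold: arbitrary brightening level -- a real system would derive
    this from the scene, after correcting for the water column.
    """
    change = after.astype('float64') - before.astype('float64')
    return change > threshold

# Toy example: one pixel brightens markedly between the two dates
before = np.array([[0.10, 0.11], [0.10, 0.12]])
after = np.array([[0.10, 0.19], [0.11, 0.12]])
print(bleaching_candidates(before, after))
```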

This work has proved that coral bleaching can be seen from space, although it needs to be supported by in-situ work. ESA intends to develop a coral reef tool, which will be part of the open-source Sentinel Application Platform (SNAP) toolkit. This will enable anyone to monitor the health of coral reefs worldwide and hopefully, help protect these natural wonders.

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS image from 25 April 2017 of the Yucatán Peninsula, showing where thermal bands have picked up increased temperatures. Data courtesy of NASA; NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA has released images from different instruments, on different satellites, that illustrate two of the ways satellites can monitor fires.

Acquired on the 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument's thermal band detected temperatures higher than normal.

False colour image of the West Mims fire on the Florida/Georgia boundary, acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border, captured from NASA's Aqua satellite on the 02 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). On the natural colour image the fires could only be seen as smoke plumes, but on the left is the false colour image, which combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself shows as orange.
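Building a false colour composite like this is simply a matter of assigning non-visible bands to the red, green and blue channels of the display. A minimal sketch, assuming the three bands have already been loaded as 2-D arrays:

```python
import numpy as np

def false_colour(swir, nir, green):
    """Stack shortwave-infrared, near-infrared and green bands into an
    RGB display array, as used for burn-scar composites.

    Inputs are 2-D reflectance arrays; output is (rows, cols, 3) in [0, 1].
    Burnt areas appear brown/orange because they stay bright in SWIR
    while being dark in NIR (healthy vegetation is the reverse).
    """
    rgb = np.dstack([swir, nir, green]).astype('float64')
    for i in range(3):  # simple per-channel stretch to [0, 1] for display
        band = rgb[..., i]
        rgb[..., i] = (band - band.min()) / (band.max() - band.min() + 1e-12)
    return rgb
```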

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; it appears that the MODIS fire products contribute the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities were able to identify, and fine, 226 farmers for undertaking this practice.

Burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo, imaged by Sentinel-2.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest that was burnt last year in the Republic of Congo – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with the Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, and the revisit time will improve to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36 000 hectares of forest were burnt in 2016.

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses which should potentially help save lives. This is another example of the societal benefit of satellite remote sensing.

Remote Sensing Goes Cold

Average thickness of Arctic sea ice in spring as measured by CryoSat between 2010 and 2015. Image courtesy of ESA/CPOM

Remote sensing over the Polar Regions has poked its head above the ice recently.

On the 8th February The Cryosphere, a journal of the European Geosciences Union, published a paper by Smith et al. titled ’Connected subglacial lake drainage beneath Thwaites Glacier, West Antarctica’. It described how researchers used data from ESA’s CryoSat-2 satellite to look at lakes beneath a glacier.

This work is interesting from a remote sensing viewpoint as it is a repurposing of CryoSat-2’s mission. Its main purpose is to measure the thickness of ice sheets and marine ice cover using its Synthetic Aperture Radar (SAR)/Interferometric Radar Altimeter, known as SIRAL, which can detect millimetre changes in the elevation of both ice sheets and sea ice.

The team were able to use this data to determine that the ice of the glacier had subsided by several metres as water drained away from four lakes underneath. Whilst the whole process took place between June 2012 and January 2014, the majority of the drainage happened in a six-month period. During this time it’s estimated that peak drainage was around 240 cubic metres per second, which is four times faster than the outflow of the River Thames into the North Sea.

We’ve previously highlighted that repurposing data – using data for more purposes than originally intended – is going to be one of the key future innovation trends for Earth Observation.

Last week, ESA also described how Sentinel-1 and Sentinel-2 data have been used over the last five months to monitor a crack in the ice near the Halley VI research base of the British Antarctic Survey (BAS). The crack, known as the Halloween Crack, is located on the Brunt Ice Shelf in the Weddell Sea sector of Antarctica and was identified last October. The crack grew around 600 m per day during November and December, although it has since slowed to only one third of that daily growth.

Since last November Sentinel-2 has been acquiring optical images at each overflight, and this has been combined with SAR data from the two Sentinel-1 satellites. The SAR data will be critical during the Antarctic winter, when there are only a few hours of daylight and a couple of weeks around mid-June when the sun does not rise.

This work hit the headlines as BAS decided to evacuate their base for the winter, due to the potential threat. The Halley VI base, which was only 17 km from the crack, is the first Antarctic research station specifically designed to allow relocation to cope with this sort of movement in the ice shelf. It was already planned to move the base 23 km further inland, and this was successfully completed on the 2nd February. Further movement will depend on how the Halloween Crack develops over the winter.

Finally, the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) project was announced this week at the annual meeting of the American Association for the Advancement of Science. Professor Markus Rex outlined the project, which will sail a research vessel into the Arctic sea ice and let it get stuck so it can drift across the North Pole. The vessel will be fitted with a variety of remote sensing and in-situ instruments, and will aim to collect data on how the climate is changing in this part of the world by measuring the atmosphere-ice-ocean system.

These projects show that the Polar Regions have a lot of interest, and variety, for remote sensing.

Supporting Chimpanzee Conservation from Space

Gombe National Park, Tanzania. Acquired by Sentinel-2 in December 2016. Image courtesy of ESA.

Being able to visualise the changing face of the planet over time is one of the greatest strengths of satellite remote sensing. Our previous blog showed how Dubai’s coastline has evolved over a decade, and last week NASA described interesting work they’re doing on monitoring habitat loss for chimpanzees in conjunction with the Jane Goodall Institute.

Jane Goodall has spent over fifty years working to protect and conserve chimpanzees from the Gombe National Park in Tanzania, and formed the Jane Goodall Institute in 1977. The Institute works with local communities to provide sustainable conservation programmes.

A hundred years ago more than one million chimpanzees lived in Africa; today the World Wildlife Fund estimates the population may only be around 150,000 to 250,000. The decline is stark. For example, the Ivory Coast populations have declined by 90% within the last twenty years.

One of the key factors contributing to this decline is habitat loss, mostly through deforestation, although other factors such as hunting, disease and illegal capture also contribute.

Forests cover around 31% of the planet, and deforestation occurs when trees are removed and the land is converted to another use instead of remaining forest. In chimpanzee habitats, the deforestation is mostly due to logging, mining and drilling for oil. This change in land use can be monitored from space using remote sensing. Satellites produce regular images which can be used to monitor changes in the natural environment, in turn giving valuable information to conservation charities and other organisations.

In 2000 Lilian Pintea, from the Jane Goodall Institute, was shown Landsat images comparing the area around the Gombe National Park in 1972 and 1999. The latter image showed huge deforestation outside the park’s boundary. The Institute has continued to use Landsat imagery to monitor what is happening around the National Park. In 2009 they began a citizen science project with local communities, giving them smartphones to report their observations. Combining these with ongoing satellite data from NASA has helped develop and implement local plans for land use and protection of the forests. Further visualisation of this work can be found here. The image at the top was acquired by Sentinel-2 in December 2016 and shows the Gombe National Park, although it is under a little haze.

The satellite data supplied by NASA comes from the Landsat missions, which currently have an archive of almost forty-five years of satellite data that is freely available to anyone. We also used Landsat data in our Dubai animation last week. Landsat captures optical data, which means it operates in a similar manner to the human eye – although the instruments also have infrared capabilities. However, one drawback of optical instruments is that they cannot see through clouds. Therefore, whilst Landsat is great for monitoring land use when there are clear skies, it can be combined with synthetic aperture radar (SAR), from the microwave spectrum, which can see through both clouds and smoke. This combination enables land use and land change to be monitored anywhere in the world. Using the freely available Landsat and Sentinel-1 SAR data, you could monitor what is happening to the forests in your own neighbourhood.
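A minimal sketch of how the two data types can be combined in practice: take a change map from the optical imagery where the view is clear, and fall back on a SAR-derived change map under cloud. The arrays and mask below are placeholders; deriving the change maps themselves is a separate task.

```python
import numpy as np

def forest_change(optical_change, sar_change, cloud_mask):
    """Combine optical and SAR change maps into one result.

    optical_change, sar_change: boolean 2-D arrays flagging change,
    derived separately from each sensor (details omitted here).
    cloud_mask: True where the optical image is cloudy.
    SAR sees through cloud, so it fills the optical gaps.
    """
    return np.where(cloud_mask, sar_change, optical_change)

# Toy example: the cloudy top-left pixel takes its value from SAR
optical = np.array([[False, True], [False, False]])
sar = np.array([[True, True], [False, True]])
clouds = np.array([[True, False], [False, False]])
print(forest_change(optical, sar, clouds))
```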

Satellite data is a powerful tool for monitoring changes in the environment, and the archives available offer a unique opportunity to see what has happened over the last four decades.