Is This The Worst Global Coral Bleaching Event Ever?

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

It was announced last week that 93% of the Great Barrier Reef has been hit by coral bleaching due to rising sea temperatures from El Niño and climate change. We first wrote about the third worldwide coral bleaching event in October 2015, noting this year’s event could be bad. Those fears appear to be coming true: an aerial survey of 911 coral reefs by Australia’s National Coral Bleaching Task Force found that 93% had suffered from bleaching, and 55% of those had suffered severe bleaching.

Coral bleaching occurs when water stresses cause corals to expel the photosynthetic algae that give them their colours, exposing the skeleton and turning them white. The stress is mostly due to higher seawater temperatures, although cold-water stress, run-off, pollution and high solar irradiance can also cause bleaching.

Bleaching does not kill coral immediately, but it puts corals at greater risk of mortality. Recovery is possible if the water stress subsides and normal conditions return. This is the hope for the Northern Sector of the reef, above Port Douglas, where around 81% of corals suffered severe bleaching; the water quality in this area is good, which should also aid recovery. The reefs fared better further south. Within the Central Sector, between Port Douglas and Mackay, 75 of the 226 reefs suffered severe bleaching, whilst in the Southern Sector, below Mackay, only two reefs suffered severe bleaching and 25% had no bleaching at all.

The news is not all bad. A survey of the coral reefs of the Andaman and Nicobar Islands, a territory of India that marks the dividing line between the Bay of Bengal and the Andaman Sea, also published this week, shows no evidence of coral bleaching. This survey is interesting for the remote sensing community as it was undertaken by a remotely operated vehicle, PROVe, developed by India’s National Institute of Ocean Technology. As well as mapping the coral reefs, PROVe has a radiometer attached and is measuring the spectral signatures of the corals in the area, which could be used to support the monitoring of corals from satellites.

Monitoring coral bleaching from space has been done before. For example, Envisat’s MERIS sensor was shown to be able to detect coral bleaching down to a depth of ten metres, and the Coral Bleaching Index (Ziskin et al., 2011) uses the red, green and blue bands to measure the increase in spectral reflectance of bleached corals. Given the size, geographical spread and oceanic nature of coral reefs, satellite remote sensing should be able to offer valuable support in monitoring their health.
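To illustrate the idea behind such an index: bleached coral reflects more light across the visible bands, so a simple proxy is the change in mean red/green/blue reflectance against a pre-bleaching baseline. The sketch below is only an illustration of that concept, not the published Ziskin et al. formulation, and the reflectance values are invented for the example.

```python
import numpy as np

def bleaching_brightness_change(rgb_now, rgb_baseline):
    """Illustrative proxy for a coral bleaching index: compare the mean
    red/green/blue reflectance of a current image against a pre-bleaching
    baseline. (A sketch of the concept only, not the Ziskin et al. formula.)

    Both inputs are arrays of shape (rows, cols, 3) holding reflectance
    values in the range 0-1.
    """
    brightness_now = np.asarray(rgb_now, dtype=float).mean(axis=2)
    brightness_base = np.asarray(rgb_baseline, dtype=float).mean(axis=2)
    # Positive values mean the pixel has brightened: possible bleaching
    return brightness_now - brightness_base

# Tiny worked example: one pixel brightens from 0.2 to 0.5 mean reflectance
baseline = np.full((1, 1, 3), 0.2)
current = np.full((1, 1, 3), 0.5)
change = bleaching_brightness_change(current, baseline)
```

In practice the comparison would be made on atmospherically corrected, cloud-free imagery of shallow reef areas, where the water column does not mask the signal.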

Following the first global bleaching event, in 1997/98, research confirmed that 16% of the world’s coral died. Who knows what the outcome of the current event will be?

Sentinel’s Milestone and Millstone

Sentinel-1A multi-temporal colour composite of land coverage across Ireland. Contains modified Copernicus Sentinel data [2015], processed by ESA. Data courtesy of ESA.

A significant milestone was achieved for the European Commission’s Copernicus Programme this week with the launch of the Sentinel-1B satellite. It is the fourth Sentinel satellite to be launched and, as the second Sentinel-1 satellite, it completes the first of the planned two-satellite constellations.

It was launched on 25th April from French Guiana. In addition to Sentinel-1B, the Soyuz rocket carried three student CubeSats, which will be deployed into orbit. Students from the University of Liège, the Polytechnic of Turin, Italy, and the University of Aalborg developed the 10 cm cube satellites as part of ESA’s ‘Fly Your Satellite!’ programme.

Sentinel-1B is an identical twin to Sentinel-1A, which was launched on 3rd April 2014, and they will operate as a two-satellite constellation orbiting 180 degrees apart at an altitude of approximately 700 km. Both carry a C-band Synthetic Aperture Radar (SAR) instrument and together they will cover the entire planet every six days, with the Arctic revisited every day and Europe, Canada and the main shipping routes every three days.

Sentinel-1 data has a variety of applications, including sea-ice monitoring, maritime surveillance, humanitarian aid for disasters, and mapping for forest, water and soil management. The benefits were demonstrated this week with:

  • A video showing the drop in rice-growing productivity in the Mekong River Delta over the last year; and
  • The multi-temporal colour composite of land coverage of Ireland shown at the top of this post. It was created from 16 radar scans acquired over 12 days during May 2015, where the blues represent changes in water or agricultural activities such as ploughing, the yellows represent urban centres, the greens represent vegetated fields and forests, and the reds and oranges represent unchanging features such as bare soil.

With this constellation up and working, the improved revisit time has the chance to be a game changer in the uptake of space-generated data.

Sadly, there’s a millstone hanging around the Copernicus Programme’s neck that hinders this change: accessing the data remains difficult for commercial organisations.

Currently, selecting and downloading Sentinel data is a painful process, one that often either does not work or is so slow that you give up on it! This is caused by the size of the datasets and the popularity of data that is free to access for everyone worldwide.

There are a number of ways of getting access to this data, with varying success in our experience, including:

  • EU’s Copernicus Hub – operational, but slow to use. Once you have selected the data to download, either manually or via a script, the process is extremely slow and often times out before the download completes.
  • USGS – offers Sentinel-2, but not Sentinel-1, data via its EarthExplorer and GloVis interfaces. The download process is easier, but the format of Sentinel-2 makes searching in GloVis a bit strange, and it is only a partial representation of the available acquisitions.
  • The UK Collaborative Ground Segment, despite an agreement signed with ESA in March 2015, has not yet been made available to commercial entities.
  • The academically focused STFC Centre for Environmental Data Analysis (CEDA) system, which you can apply for access to; it provides FTP access with good download speeds for the data that’s available.
  • Amazon’s archive of Sentinel-2 data, which has good download speeds but is cumbersome to search without developing software, i.e. scripts.
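For those comfortable with scripting, the Copernicus Hub can be queried programmatically through its OpenSearch interface. The sketch below builds a query URL for Sentinel-2 scenes over a date range and area of interest; the endpoint, query syntax and any authentication requirements are assumptions based on the hub’s documentation at the time of writing and may change, and the polygon coordinates are purely illustrative.

```python
from urllib.parse import urlencode

def build_scihub_query(platform, start_date, end_date, footprint_wkt, rows=10):
    """Build an OpenSearch query URL for the Copernicus Open Access Hub.

    A sketch only: downloading the returned products still requires
    registered credentials, and the endpoint may change over time.
    """
    base = "https://scihub.copernicus.eu/dhus/search"
    # The hub accepts a Solr-style query string combining search terms
    query = (
        f"platformname:{platform} AND "
        f"beginposition:[{start_date}T00:00:00.000Z TO {end_date}T23:59:59.999Z] AND "
        f'footprint:"Intersects({footprint_wkt})"'
    )
    return base + "?" + urlencode({"q": query, "rows": rows, "format": "json"})

# Illustrative example: Sentinel-2 scenes over a box around Plymouth in April 2016
url = build_scihub_query(
    "Sentinel-2", "2016-04-01", "2016-04-30",
    "POLYGON((-4.2 50.3,-4.0 50.3,-4.0 50.5,-4.2 50.5,-4.2 50.3))",
)
```

The resulting URL can then be fetched with any HTTP client using your hub credentials, and the JSON response parsed to extract product identifiers for download.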

There are also further services and routes being developed to facilitate searching and downloading from the various archives – for example, the QGIS ‘Semi-Automatic Classification’ plugin and the EOProc SatCat service for Sentinel-2. With Sentinel-3A data coming online soon, the situation will get more complex for those of us trying to use data from all the Sentinel missions.

Getting the satellites into space is great, but that is only the first step in widening the use of space-generated data. Until the data is put into the hands of the people who can use it to create value and inspire others, the Sentinel data will not fulfil its full potential.

The cost of ‘free data’

False Colour Composite of the Black Rock Desert, Nevada, USA.  Image acquired on 6th April 2016. Data courtesy of NASA/JPL-Caltech, from the Aster Volcano Archive (AVA).

Last week, the US and Japan announced free public access to the archive of nearly 3 million images taken by the ASTER instrument; previously this data had only been accessible for a nominal fee.

ASTER, the Advanced Spaceborne Thermal Emission and Reflection Radiometer, is a joint Japan-US instrument aboard NASA’s Terra satellite, with its data used to create detailed maps of land surface temperature, reflectance and elevation. When NASA made the Landsat archive freely available in 2008, an explosion in usage occurred. Will the same happen with ASTER?

As a remote sensing advocate I want many more people to be using satellite data, and I support any initiative that contributes to this goal. Public satellite data archives, such as Landsat, are often referred to as ‘free data’. This phrase is unhelpful, and I prefer the term ‘free to access’, because ‘free data’ isn’t free: someone has already paid to get the satellites into orbit, to download the data from the instruments, and to provide the websites that make the data available. So, who has paid for it? To be honest, it’s you and me!

To be accurate, these missions are generally funded by the taxpayers of the countries that put the satellite up. For example:

  • ASTER was funded by the American and Japanese public
  • Landsat is funded by the American public
  • The Sentinel satellites, under the Copernicus missions, are funded by the European public.

In addition to making basic data available, missions often also create a series of products derived from the raw data. This is achieved either by commercial companies being paid grants to create these products, which can then be offered as free-to-access datasets, or by companies developing the products themselves and then charging users to access them.

‘Free data’ also creates user expectations, which may be unrealistic. Whenever a potential client comes to us, there is always a discussion about which data source to use. Pixalytics is a data-independent company, and we suggest the best data to suit the client’s needs – but this isn’t always a free-to-access dataset! There are a number of physical and operating criteria to consider:

  • Spectral wavebands / frequency bands: wavelengths for optical instruments and frequencies for radar instruments, which determine what can be detected.
  • Spatial resolution: the size of the smallest objects that can be ‘seen’.
  • Revisit times: how often you are likely to get a new image – important if you’re interested in several acquisitions close together.
  • Long-term data archives: very useful if you want to look back in time.
  • Availability: for example, delivery schedules and ordering requirements.

We don’t want any client to pay for something they don’t need, but sometimes commercial data is the best solution. As the cost of this data can range from a few hundred to thousands of pounds, this can be a challenging conversation given all the promotion of ‘free data’.

So, what’s the summary here?

If you’re analysing large amounts of data, e.g. for a time series or a large geographical area, then free-to-access public data is a good choice, as buying hundreds of images would often get very expensive and the higher spatial resolution isn’t always needed. However, if you want a specific acquisition over a specific location at high spatial resolution, then the commercial missions come into their own.

Just remember, no satellite data is truly free!

Blog of Many Colours

Image featuring the sister cities of Sault Sainte Marie, Ontario, and Sault Sainte Marie, Michigan. ESA’s Proba satellite acquired this image on 11 August 2006 with its Compact High Resolution Imaging Spectrometer (CHRIS), designed to acquire hyperspectral images with a spatial resolution of 18 metres across an area of 14 kilometres. Data courtesy of SSTL through ESA.

The aspect of art at school that really stuck with me was learning about the main colours of the rainbow and how they fit together – like with like, such as yellow, green, blue, and like with unlike such as shades of green with a fleck of red to put spark into a picture. Based on these ideas, when I was a teenager I used to construct geometric mandalas coloured in with gouache. As I began studying remote sensing, it seemed natural that hyperspectral imaging would hold a special fascination.

The term hyperspectral imaging was coined by Goetz in 1985 and is defined as ‘the acquisition of images in hundreds of contiguous, registered, spectral bands such that for each pixel a radiance spectrum can be derived’. Put simply, whereas a television picture is made using three colour components (red, green and blue), hyperspectral imaging splits the spectrum into many, sometimes hundreds, of different grades of colour for each part of the image. The term made its way into scientific language by way of the intelligence communities – the military became interested because it offered the ability to tell plastic decoys from real metal tanks, as well as giving an object’s precise colour.

When the first field spectral measurements were conducted in the early 1970s, technology was not advanced enough for imaging spectroscopy to be put into operation. However, developments in electronics, computing and software throughout the 1980s and into the 1990s brought hyperspectral imaging to the Earth observation (EO) community.

A series of parallel hardware developments began in the 1980s, such as NASA JPL’s Airborne Imaging Spectrometer (AIS) in 1983, followed by AVIRIS (Airborne Visible/InfraRed Imaging Spectrometer). The AVIRIS sensor was first flown in 1987 on a NASA aircraft at a 20 km altitude, and to this day it is still a key provider of high-quality hyperspectral data for the scientific community.

The hardware advances were matched by improvements in software capabilities, including the development of the iconic image cube method of handling this type of data by PhD students Joe Boardman and Kathryn Kierein-Young from the University of Colorado. Spectral libraries have been amassed for over 2,400 natural and artificial materials, enabling them to be identified. The most famous is the ASTER spectral library, which includes inputs from the Johns Hopkins University (JHU) Spectral Library, the Jet Propulsion Laboratory (JPL) Spectral Library and the United States Geological Survey (USGS – Reston) Spectral Library.
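The image cube idea is easy to picture in code: a hyperspectral scene is an array of rows × columns × bands, and slicing out one pixel gives its radiance spectrum, which can then be matched against library spectra. The sketch below uses the spectral angle approach of the widely used Spectral Angle Mapper (SAM) classifier; the four-band ‘cube’ and two-entry ‘library’ are toy values for illustration only.

```python
import numpy as np

def spectral_angle(spectrum, reference):
    """Spectral angle (radians) between a pixel spectrum and a library
    reference -- the core of the Spectral Angle Mapper (SAM) classifier
    commonly applied to hyperspectral data."""
    s = np.asarray(spectrum, dtype=float)
    r = np.asarray(reference, dtype=float)
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_pixel(spectrum, library):
    """Return the library material whose reference spectrum makes the
    smallest angle with the observed pixel spectrum."""
    return min(library, key=lambda name: spectral_angle(spectrum, library[name]))

# Toy image cube: 2 x 2 pixels, 4 bands; plus a toy two-entry "spectral library"
cube = np.zeros((2, 2, 4))
cube[0, 0] = [0.1, 0.2, 0.4, 0.8]   # reflectance rising towards the infrared
library = {"vegetation": [0.05, 0.1, 0.2, 0.4], "soil": [0.3, 0.3, 0.3, 0.3]}

pixel_spectrum = cube[0, 0]          # one radiance spectrum per pixel
label = classify_pixel(pixel_spectrum, library)
```

A nice property of the spectral angle is that it is insensitive to overall brightness, so a shadowed and a sunlit pixel of the same material map to the same angle.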

Hyperspectral imaging was primarily developed for the mapping of soils and rock types, whose spectra are rich in character. Taking regions from the contiguous spectrum makes it possible to identify surface materials by reflectance or emission, and also allows precise atmospheric correction, which can only be approximated when using discrete, wide colour bands. The shape of the reflectance or emittance spectrum yields information about grain size, abundance and composition, as well as the biochemistry of vegetation – such as the concentration of chlorophyll and other pigments – and life forms in water bodies.

Earth observation hyperspectral imaging really began with NASA’s Earth Observing-1 (EO-1) mission, launched in 2000 with the 220-band Hyperion imager on board. Since then various other missions have been launched, such as the Compact High Resolution Imaging Spectrometer (CHRIS), with 63 spectral bands, on the Proba-1 satellite in 2001; and the Infrared Atmospheric Sounding Interferometer (IASI) on board the MetOp series of meteorological satellites, the first of which was launched in 2006.

The coming years for hyperspectral imaging look exciting, with a whole series of planned missions including the Italian PRISMA (PRecursore IperSpettrale della Missione Applicativa), the German EnMAP (Environmental Mapping and Analysis Program), NASA’s HyspIRI (Hyperspectral Infrared Imager) and JAXA’s (Japan Aerospace Exploration Agency) Hyperspectral Imager Suite (HISUI).

So for me, and anyone with the same fascination, the future really will be filled with many colours!

Blog written by Dr Louisa Reynolds

Satellite Data Continuity: Hero or Achilles Heel?

Average thickness of Arctic sea ice in spring as measured by CryoSat between 2010 and 2015. Image courtesy of ESA/CPOM

One of satellite remote sensing’s greatest strengths is the archive of historical data available, allowing researchers to analyse how areas change over years or even decades – Landsat, for example, has a forty-year archive. It is one of the unique aspects of satellite data, and very difficult to replicate with other measurement methods.

However, this unique selling point is also proving to be an Achilles heel, as highlighted last week when a group of 179 researchers issued a plea to the European Commission (EC) and the European Space Agency (ESA) to provide a replacement for the ageing CryoSat-2 satellite.

CryoSat-2 was launched in 2010, after the original CryoSat was lost in a launch failure in 2005, and is dedicated to the measurement of polar ice. It has a non-sun-synchronous low Earth orbit of just over 700 km with a 369-day ground-track cycle, although it images the same areas on Earth every 30 days. It was originally designed as a three-and-a-half-year mission, but is still going after six years. Although it technically has enough fuel to last at least another five years, the risk of component failure is such that researchers are concerned it could cease to function at any time.

The main instrument onboard is the Synthetic Aperture Interferometric Radar Altimeter (SIRAL), operating in the Ku band. It has two antennas that form an interferometer, and it operates by sending out bursts of pulses at intervals of only 50 microseconds, with the returning echoes correlated as a single measurement; conventional altimeters, by contrast, send out a single pulse and wait for its echo to return before sending another. This allows SIRAL to measure the difference in height between floating ice and seawater to an accuracy of 1.3 cm, which is critical to measuring the edges of ice sheets.

SIRAL has been very successful and has offered a number of valuable datasets including the first complete assessment of Arctic sea-ice thickness, and measurements of the ice sheets covering Antarctica and Greenland. However, these datasets are simply snapshots in time. Scientists want to continue these measurements in the coming years to improve our understanding of how sea-ice and ice sheets are changing.

It’s unlikely ESA will provide a follow-on satellite, as its aim is to develop new technology rather than fly data continuity missions. This was part of the reason why the EU Copernicus programme of Sentinel satellites was established, with the aim of providing reliable and up-to-date information on how our planet and climate are changing. Although the recently launched Sentinel-3 satellite can undertake some of CryoSat-2’s measurements, it is not a replacement.

Whether the appeal for a CryoSat-3 will be heard is unclear, but what is clear is that thought needs to be given to data continuity with every mission. Once useful data is made available, there will be a desire for the dataset to be continued and developed.

This returns us to the title of the blog. Is data continuity the hero or Achilles Heel for the satellite remote sensing community?

Crowd Sourced Alphabets from Space

From the catalogue of the Aerial Bold Project.

Today we’re looking at an unusual application of satellite data, namely using it to create alphabets.

The first example is the Aerial Bold Project, established by Benedikt Groß and Joey Lee in 2014. It set out to create typeface fonts from satellite imagery, and they set up a Kickstarter campaign in November 2014 to provide the funding for the project. The campaign raised $11,492 in thirty days from 569 backers, and we were one of them! Regular blog readers will know we love anything quirky to do with satellite imagery, and this was something we wanted to support.

Having secured funding, the project established a crowd-sourcing app, called the Letter Finder, to allow anyone to review Mapbox satellite imagery to try and identify landscape features, natural or man-made, that looked like letters or numbers – and over 11,400 were identified from 22 different countries.

Our P from the Aerial Bold Project.

All images were recorded, catalogued, and rated for beauty and readability. As one of the backers we were entitled to own one of the letters in the catalogue, and it will come as no surprise that we went for a P for Pixalytics, which you can see here! It is a small island within the Zevenhuizerplas, a 1.5 km water body created by a dam on the northeast outskirts of Rotterdam, Netherlands. Our letter received top marks for beauty, but not for readability; we can’t really argue with that assessment, as technically the P is lying on its side in real life!

The Aerial Bold Project reviewed the catalogued letters to determine the best examples to create actual fonts. So far they have created three: Buildings, Suburbia and Provence. Sadly, our letter was not selected for any of the fonts. However, we think our P is beautiful and for us, a satellite image of trees surrounded by water is a great metaphor for the company.

The project website gives you access to the fonts, the whole catalogue of letters and a fun typewriter app which allows you to create messages from the sourced images – as we have done with ‘Pixalytics’ at the top of the blog.

Whilst researching this blog, I came across a second crowd-sourced alphabet from space! This one was co-ordinated by Adam Voiland, a science writer for NASA Earth Observatory. After seeing the letter V in smoke on a satellite image, he sought help from colleagues and readers to create an entire alphabet based on imagery and photographs taken by NASA.

Adam’s version is different from the Aerial Bold Project, as his alphabet includes letters formed by transient phenomena such as smoke or phytoplankton plumes. It took Adam over three years to complete his alphabet, and all twenty-six letters can be seen on the NASA Earth Observatory website. Being satellite geeks, we love the fact that Adam has captioned each image, not only explaining what the image shows but also describing the satellite and sensor that took the picture! It was great to see such a range of satellites used within the alphabet.

These alphabets are fantastic examples of unusual, innovative and fun ways of using satellite data, and we liked the fact that both were crowd-sourced. Why don’t you have a look at, and a play with, these crowd-sourced alphabets from space!

Identifying Urban Sprawl in Plymouth

Map showing urban sprawl over last 25 years in the areas surrounding Plymouth

Nowadays you can answer a wide range of environmental questions yourself using only open source software and free satellite remote sensing data. You do not need to be a researcher; by acquiring a few skills you can have the analysis of complex problems at your fingertips. It is amazing.

I’ve been based at Pixalytics in Plymouth over the last few months on an ERASMUS+ placement, and decided to use Plymouth to look at one of the most problematic environmental issues for planners: urban sprawl. It is a well-known phenomenon within cities, but it can’t easily be seen from ground level – you need to look at it from space.

The pressure of continued population growth, the need for more living space, and commercial and economic development mean that central urban areas tend to expand into low-density, monofunctional and usually car-dependent communities, with a highly negative ecological impact on fauna and flora associated with massive losses of natural habitat and agricultural land. This change in how land is used around cities is known as urban sprawl.

As a city Plymouth suffered a lot of destruction in World War Two, and there was a lot of building within the city in the 1950s and 1960s. Therefore, I decided to see if Plymouth has suffered from urban sprawl over the last twenty-five years, using open source software and data. The two questions I want to answer are:

  1. Is Plymouth affected by urban sprawl? and
  2. If it is, what are Plymouth’s urban sprawl trends?

1) Is Plymouth affected by urban sprawl?
To answer this question I used the QGIS software to analyse Landsat data from both 1990 and 2015, together with OpenStreetMap data for natural areas, for a 15-kilometre area centred on Plymouth’s city centre.

I then performed a Landscape Evolution analysis, as described in Chapter 9 of the Practical Handbook of Remote Sensing, written by Samantha and Andrew Lavender from Pixalytics. Firstly, I overlaid natural areas onto the map of Plymouth, then added the built-up areas from 2015, shown in red, and finally added the 1990 built-up areas in grey.
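In code terms, the overlay step boils down to simple raster arithmetic. The sketch below assumes the 1990 and 2015 Landsat scenes have already been classified into built-up/not-built-up masks; the tiny arrays are made-up stand-ins for those classification outputs, which in practice come out of the QGIS workflow.

```python
import numpy as np

# Hypothetical 1/0 "built-up" masks classified from the 1990 and 2015
# Landsat scenes (toy 3x3 rasters standing in for full classified scenes)
built_1990 = np.array([[1, 0, 0],
                       [1, 0, 0],
                       [0, 0, 0]], dtype=bool)
built_2015 = np.array([[1, 1, 0],
                       [1, 1, 0],
                       [0, 0, 0]], dtype=bool)

# New development = pixels built-up in 2015 but not in 1990
sprawl = built_2015 & ~built_1990

# Percentage growth of the built-up area over the period
growth_pct = 100.0 * (built_2015.sum() - built_1990.sum()) / built_1990.sum()
```

The `sprawl` mask corresponds to the red areas on the map, and the same per-pixel comparison scales directly to full 30 m resolution Landsat rasters.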

Detailed map showing the key urban sprawl around Plymouth over last 25 years

The map, which has an accuracy of 80-85%, shows that no major urban development occurred in the city of Plymouth and its surroundings over the last 25 years – this is of course about to change with the development of the new town of Sherford on the outskirts of the city.

However, as you can see in the zoomed-in version of the map on the right, there is noticeable urban development visible in the north west of the city, and a second area in Saltash, Cornwall, on the west of the map. The built-up area within 15 km of Plymouth increased by around 15% over the 25-year period. The next question is: what are the trends of this sprawl?

2) What are Plymouth’s urban sprawl trends?
A large amount of research tries to categorise urban sprawl into various types:

  • Compact growth, which infills existing urban developments, is also known as smart growth and mainly occurs in areas with planning permission
  • Linear development along main roads
  • Isolated developments in agricultural or wildlife areas close to major roads.

The last two have a bad reputation and are often associated with negative impacts on the environment.

Various driving forces are behind these growth types, creating different patterns for cities worldwide. For example, rapid economic development under a liberal planning policy drives population growth in a city, which then expands over time to incorporate nearby and more distant villages. This is a fragmented approach, and results in significant land loss.

But this is not the case for Plymouth, which over the last 25 years has shown stable development within the extent permitted by planning policies, with predominantly infill and compact expansion – a smart growth approach that other cities could take as an example.

These conclusions can be reached in only a few simple steps, taking advantage of free open source software and free data, without extensive experience or training. It is a proven example of how you can make your own maps at home without investing too much time or money.

This is the end of my internship with Pixalytics, and it has been one of my best experiences.

Blog written by Catalin Cimpianu, ERASMUS+ Placement at Pixalytics.

Four Fantastic Forestry Applications of Remote Sensing

Landsat Images of the south-east area of Bolivia around Santa Cruz de la Sierra 27 years apart showing the changes in land use. Data courtesy of USGS/NASA.

Monitoring forest biomass is essential for understanding the global carbon cycle because:

  • Forests account for around 45% of terrestrial carbon, and deforestation accounts for 10% of greenhouse gas emissions
  • Deforestation and forest degradation release approximately the same amount of greenhouse gases as all the world’s road traffic
  • Forests sequester significant amounts of carbon every year

The United Nations (UN) intergovernmental Reducing Emissions from Deforestation and forest Degradation in developing countries (REDD+) programme was secured in 2013 during the 19th Conference of the Parties to the UN Framework Convention on Climate Change. It requires countries to map and monitor deforestation and forest degradation, together with developing a system of sustainable forest management. Remote sensing can play a great role in helping to deliver these requirements, and below are four fantastic remote sensing initiatives in this area.

Firstly, the Real Time System for Detection of Deforestation (DETER) gives monthly alerts on potential areas of deforestation within the Amazon rainforest. It uses data from MODIS, at 250 m pixel resolution, within a semi-automated classification technique: a computer model detects changes in land use and cover, such as forest clearing, which are then validated by interpreters. It has been valuable in helping Brazil reduce deforestation rates by around 80% over the last decade; however, the model’s output takes two weeks to produce.

Zoomed in Landsat Images of the south-east area of Bolivia around Santa Cruz de la Sierra 27 years apart showing the changes in land use. Data courtesy of USGS/NASA.

A similar initiative is FORest Monitoring for Action (FORMA), which also uses MODIS data. FORMA is a fully automated computer model that combines vegetation reflectance data from MODIS, active fire detections from NASA’s Fire Information for Resource Management System and rainfall figures to identify potential forest clearing. It produces alerts twice a month, and works on tropical humid forests worldwide.

A third initiative aims to provide faster deforestation alerts, building on the research by Hansen et al. published in 2013. The researchers used successive passes of the current Landsat satellites to monitor land cover, flagging gaps in forest cover that appear between passes. These will be displayed on an online map, and the alerts will be available through the World Resources Institute’s Global Forest Watch website, starting in March 2016. With the 30 m resolution of Landsat, smaller-scale changes in land use can be detected than is possible with sensors such as MODIS. Whilst this is hoped to help monitor deforestation, it doesn’t actually confirm it, as there could be other reasons for the tree loss, and further investigation will be required. Being an optical mission, Landsat has problems seeing both through clouds and beneath the forest canopy, so its use will be limited in areas such as tropical rainforests.

Finally, one way to combat the weather and canopy issues is to use radar to assess forests, and the current AfriSAR project in Gabon is doing just that – although with aircraft and Unmanned Aerial Vehicles (UAVs) rather than satellites. It began in 2015 with overflights during the dry season, and the recent flights in February 2016 captured the rainy season. This joint initiative of ESA, the Gabonese Space Agency and Gabon’s Agency of National Parks aims to determine the amount of biomass and carbon stored in forests by using the unique sensitivity of P-band SAR, the lowest radar frequency used in remote sensing at 432–438 MHz. NASA joined the recent February flights, adding its Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) and Land, Vegetation and Ice Sensor (LVIS) instruments, which are prototypes of sensors to be used on future NASA missions. Overall, this is producing a unique dataset on tropical forests.

These are just four examples of how remote sensing can contribute towards understanding what is happening in the world’s forests.

Riding the Wavelength 2016

View from Mullard Space Science Laboratory

Wavelength 2016, the Remote Sensing & Photogrammetry Society’s annual conference for remote sensing students and early career scientists, took place last week. The venue was the Mullard Space Science Laboratory (MSSL) at Holmbury St. Mary, Surrey, whose two-hundred-year-old main building is deceptively stately for a science lab – panelled and furnished with seasoned wood – and had a previous life as an orphanage, amongst other things.

Pixalytics, who also sponsored the event, sent two delegates this year: Dr Louisa Reynolds and Catalin Cimpianu, our ERASMUS placement student.

The conference offers a strong scientific programme of keynotes, oral presentations and posters. Catalin gave an oral presentation on the research he has been doing during his placement with us, on ‘Monitoring Urban Sprawl Patterns in the Post-Socialist Romanian Cities Using LANDSAT Imagery’. His presentation seemed to go down well with other attendees, given the questions and feedback he received.

Overall, Catalin really enjoyed the conference and found the other delegates very friendly. He felt the student presentations and posters were based on solid research, and covered a diversity of work from missions to Mars through to expeditions in Antarctica for monitoring penguin colonies. They all proved the usefulness of remote sensing and photogrammetry, together with the need for monitoring features on the Earth to get a better understanding and support sustainable future development.

Both Pixalytics representatives were impressed by the keynote speakers, particularly Professor Jan-Peter Muller, Head of the Imaging Group at MSSL, and Kathie Bowden, UK National Space Skills and Career Development Manager at the UK Space Agency. Louisa chaired the session on Vegetation Remote Sensing, but that was not her conference highlight.

As well as offering a strong scientific programme, Wavelength also offers a highly active social scene, and for Louisa the highlight was the tour round MSSL. Seeing high-precision satellite electronics being built was exciting, and learning that mistakenly soldering together two of the tiny hair-like legs on a ball grid array could mean the failure of a sensor demonstrated the precision needed in satellite engineering.

Component for a solar wind analyser

On the tour Louisa also saw ‘in the flesh’ the work benches, sealed and unsealed, used to make the components and their housings that fit inside part of a solar wind analyser, seen in the picture on the left. Ensuring dust and water are driven from sensor components is essential to avoid condensation and inaccuracies, something we are very aware of within our own work. One of the most interesting things Louisa gained from the tour was the importance of materials science for satellite engineering, such as the indispensability of lead and the lightness of aluminium. She also enjoyed the impressive cuisine of the local restaurants!

The conference generated many ideas on the latest trends and updates in Earth observation, together with suggestions on how to develop skills and professional qualifications in the field. The summary of the conference comes from Catalin, who said:

‘Well organized conference, the venue, the food, social activities, the attention to details and the organizational skills of the hosts were unquestionable and they proved to be very welcoming and hospitable.‘

Well done to everyone involved in Wavelength 2016 – we look forward to being involved again next year!

Satellite is SPOT on!

July 2009 SPOT image of Grand Cayman, data courtesy of ESA / CNES

This week marks the thirtieth anniversary of the launch of the first SPOT satellite, making it one of the longest-running satellite time series datasets.

The French Space Agency (CNES) established the series of Satellites Pour l’Observation de la Terre, known as SPOT. SPOT-1 was launched on the 22nd February 1986 and was fitted with revolutionary steerable mirrors, meaning the satellite could look in multiple directions, enabling it to observe the same point on the Earth every five days.

Since SPOT-1 there have been six subsequent satellite launches, with a strong thread of sensor consistency throughout the series, meaning it is much easier to compare imagery over time. Currently, there are three active satellites in the series – SPOT-5, SPOT-6 and SPOT-7.

Looking at SPOT’s history:

  • SPOT-1, SPOT-2 and SPOT-3 all had identical imaging sensors, namely a panchromatic band with 10 m spatial resolution, and multispectral bands in green, red and near infrared (NIR) with 20 m resolution.
  • SPOT-4 had identical multispectral bands to its predecessors, but it also added a middle IR (MIR) band. The panchromatic band operated at slightly different wavelengths, but with the same resolution. In addition, SPOT-4 also carried the first Vegetation instrument with blue, red, NIR and MIR bands at a 1 km resolution that effectively gave daily global coverage.
  • SPOT-5, launched in 2002, offered a step change in spatial resolution. The multispectral green, red and NIR bands had an improved resolution of 10 m. The panchromatic band was at 5 m, and returned to its original wavelengths. The vegetation sensor was identical to that flown on SPOT-4.
  • The latest incarnations of the series, SPOT-6 and SPOT-7, launched in 2012 and 2014 respectively, operate as a tandem constellation and again offer an improvement in resolution. The panchromatic band is down to 1.5 m, and the multispectral green, red and NIR bands are down to 6 m. They are expected to provide data through to 2024.
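To put those resolution figures in context, a quick bit of arithmetic shows how many panchromatic pixels fall within a single hectare for each generation of the series (the grouping labels are ours):

```python
# Panchromatic ground resolutions for each SPOT generation, in metres.
resolutions_m = {"SPOT-1 to SPOT-4": 10.0, "SPOT-5": 5.0, "SPOT-6/7": 1.5}

# A hectare is 10,000 square metres; each square pixel covers res * res of it.
pixels_per_hectare = {name: 10_000 / res**2 for name, res in resolutions_m.items()}

for name, n in pixels_per_hectare.items():
    print(f"{name}: {n:.0f} pixels per hectare")
```

Going from 10 m to 1.5 m pixels takes the count from 100 to roughly 4,444 pixels per hectare – over forty times more detail for picking out the kind of small-scale change the series is used to monitor.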

Time-series continuity of the Vegetation sensor has been provided by the Belgian Proba-V satellite, launched in 2013, and will be carried into the future by the OLCI sensor on the recently launched Sentinel-3A mission.

Unlike Landsat, the oldest time series dataset, SPOT is still categorised as a commercial dataset and its imagery has to be purchased.

The US Geological Survey does have a contract to provide historical SPOT-4 and SPOT-5 data over North America to some US government staff, and ESA provides limited datasets for approved projects. The French Government announced in 2014 that SPOT satellite data more than five years old would be free of charge for non-commercial users – although we’ve struggled to find it!

SPOT’s applications have included exploring for gas, oil and minerals, including routing pipelines; mapping the planet, including forestry, topographical maps and urban planning; agricultural data to combat drought and support farmers’ decision-making; and rapid-response information for disaster relief. In addition, from SPOT-5 onwards DEMs can be created using photogrammetry techniques, because the instruments produce stereoscopic images that allow minute changes on the Earth’s surface to be measured; and the global coverage of the Vegetation sensor has also contributed towards climate change research.

SPOT-1 provided imagery ten days after the Chernobyl disaster and picked up photosynthesis in the area in 1988 using its NIR band; more recently, SPOT imagery has been used to look at camps for Syrian refugees.

The SPOT series of satellites has made a huge contribution to the development of remote sensing time series datasets, and that’s worth celebrating.