Goodbye to EO-1

Hyperspectral data of fields in South America classified using Principal Components Analysis. Data acquired by Hyperion. Image courtesy of NASA.

In contrast to our previous blog, this week’s is a celebration of the Earth Observing-1 (EO-1) satellite, whose end is now fast approaching.

EO-1 was launched on the 21st November 2000 from Vandenberg Air Force Base, California. It has a polar sun-synchronous orbit at a height of 705 km, following the same orbital track as Landsat-7, but lagging one minute behind. It was put into this orbit to allow for a comparison with Landsat 7 images in addition to the evaluation of EO-1’s instruments.

It was the first in NASA’s New Millennium Program Earth Observing series, which had the aim of developing and testing advanced technology and land imaging instruments, particularly related to spatial, spectral and temporal characteristics not previously available.

EO-1 carries three main instruments:

  • Hyperion is an imaging spectrometer which collects data in 220 visible and infrared bands at 30 m spatial resolution with a 7.5 km x 100 km swath. Hyperion has offered a range of benefits to applications such as mining, geology, forestry, agriculture, and environmental management.
  • Advanced Land Imager (ALI) is a multispectral imager capturing 9 bands at 30 m resolution, plus a panchromatic band at 10 m, with a swath width of 37 km. It has the same seven spectral bands as Landsat 7, although it collects data via a different method. ALI uses a pushbroom technique, where a line of detectors acts like a broom head, sweeping out data along a strip as if a broom were being pushed along the ground. Landsat, by contrast, uses a whiskbroom approach: a small set of detectors sits perpendicular (at a right angle) to the direction of travel, and a mirror beneath them sweeps from side to side, reflecting the energy from the Earth into the detectors pixel by pixel.
  • The Atmospheric Corrector (LAC) instrument allows imagery to be corrected for atmospheric variability, primarily water vapour, by measuring the actual rate of atmospheric absorption rather than using estimates.

The original EO-1 mission was only due to last one year, but with a sixteen-year lifetime it has surpassed all expectations. The extension of the one-year mission was driven by the Earth observation user community, who were very keen for the data collection to continue, and an agreement was reached with NASA to do so.

Pseudo-true colour hyperspectral data of fields in South America. Data acquired by Hyperion. Image courtesy of NASA.

All the data collected by both Hyperion and ALI are freely available through the USGS Earth Resources Observation and Science (EROS) Center. At Pixalytics we’ve used Hyperion data to explore the capabilities of hyperspectral imagery. The two images shown in this blog are a subset of a scene acquired over fields in South America; the image to the right is a pseudo-true colour composite, stretched to show the in-field variability.
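
For anyone curious how such a composite is put together, below is a minimal sketch in Python, assuming the Hyperion scene has already been read into a NumPy array of shape (bands, rows, cols); the band indices in the commented example are purely illustrative and would need checking against the actual Hyperion band list.

    import numpy as np

    def percentile_stretch(band, low=2, high=98):
        """Linearly stretch a single band between its low and high percentiles."""
        lo, hi = np.nanpercentile(band, [low, high])
        return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

    def colour_composite(cube, red, green, blue):
        """Build a stretched RGB composite from a (bands, rows, cols) cube."""
        return np.dstack([percentile_stretch(cube[i]) for i in (red, green, blue)])

    # Illustrative band indices only; check the sensor's band list before use
    # rgb = colour_composite(cube, red=29, green=20, blue=12)

A simple percentile stretch like this is what brings out the in-field variability that would otherwise be hidden in a full-range display.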

The image at the top, meanwhile, shows the hyperspectral data classified using a statistical procedure called Principal Components Analysis (PCA), which extracts patterns from within the dataset. The first three derived uncorrelated variables, termed principal components, are shown as a colour composite.
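
As a rough illustration of the PCA step, the sketch below (Python, using scikit-learn and the same assumed (bands, rows, cols) array as above) reduces the spectral dimension to three uncorrelated components and rescales them for display as a colour composite.

    import numpy as np
    from sklearn.decomposition import PCA

    def pca_composite(cube, n_components=3):
        """Show the first principal components of a hyperspectral cube as an image."""
        n_bands, n_rows, n_cols = cube.shape
        pixels = cube.reshape(n_bands, -1).T                    # (pixels, bands)
        scores = PCA(n_components=n_components).fit_transform(pixels)
        components = scores.T.reshape(n_components, n_rows, n_cols)
        # Rescale each component to 0-1 so the three can be displayed together
        components = [(c - c.min()) / (c.max() - c.min()) for c in components]
        return np.dstack(components)                            # (rows, cols, 3)

Because the first few components carry most of the variance in the 220 bands, the resulting composite highlights patterns, such as differences between fields, that no single band shows on its own.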

Sadly, satellites cannot go on forever, and EO-1 is in its final few weeks of life. It stopped accepting data acquisition requests on the 6th January 2017, and will stop providing data by the end of February.

It has been a great satellite, and will be sadly missed.

Living Planet Is Really Buzzing!

Living Planet rotating globe in the exhibition area, photo: S Lavender

This week I’m at the 2016 European Space Agency’s Living Planet Symposium taking place in sunny Prague. I didn’t arrive until lunchtime on Monday and, with the event already underway, I hurried to the venue. First port of call was the European Association of Remote Sensing Companies (EARSC) stand, as copies of our flyers and leaflets are on display there. Why not pop along and have a look!

The current excitement and interest in Earth observation (EO) was obvious when I made my way towards the final sessions of the day. The Sentinel-2 and Landsat-8 synergy presentations were packed out, all seats taken and people were crowding the door to watch!

I started with the Thematic Exploitation Platforms session. For a long time the remote sensing community has wanted more data, and now we’re receiving it in ever larger quantities, e.g., the current Copernicus missions are generating terabytes of data daily. Given the storage requirements this creates, there is a lot of interest in online platforms that hold the data centrally: you upload your code to the platform, or use the tools it provides, rather than everyone trying to download their own individual copies. It was interesting to compare and contrast the approaches taken with hydrology, polar, coastal, forestry and urban EO data.

Tuesday was always going to be my busiest day of the Symposium as I was chairing two sessions and giving a presentation. I had an early start, as I was co-chairing the 0800 session, Coastal Zones I, alongside Bob Brewin, a former PhD student of mine! It was great to see people presenting their results using Sentinel-2. The spatial resolution, 10 m for the highest resolution wavebands, allows us to see the detail of suspended sediment resuspension events, and the 705 nm waveband can be used for phytoplankton; but we’d still like an ocean colour sensor at this spatial resolution!

In the afternoon I headed into European Climate Data Records, where there was an interesting presentation on a long time-series AVHRR above-land aerosol dataset, in which the AVHRR data is being vicariously calibrated using the SeaWiFS ocean colour sensor. It’s great to see innovation within the industry, where sensors launched for one set of applications can be reused in others. One thing emphasised by presenters in both this session and the earlier Coastal Zones one was the need to reprocess datasets to create improved data records.

My last session of the day was on Virtual Research, where I was both co-chairing and presenting. It returned to the theme of handling large datasets, and the presentations focused on building resources that make using EO data easier. This ranged from bringing in-situ and EO data together by standardising the formatting and metadata of the in-situ data, through community datasets for algorithm performance evaluation, to data cubes that bring all the data needed to answer specific questions together into a three- (or higher) dimensional array, so you spend your time asking questions of the data rather than trying to read lots of different datasets (see the sketch below). My own presentation focused on our involvement with the ESA-funded E-Collaboration for Earth Observation (E-CEO) project, which developed a collaborative platform where challenges can be initiated and evaluated, allowing participants to upload their code and have it evaluated against a range of metrics. We’d run an example challenge focused on the comparison of atmospheric correction processors for ocean colour data that, once set up, could easily be rerun.
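
To give a flavour of the data cube idea, here is a minimal sketch in Python using xarray; the dimensions, variable name and values are entirely made up, but the point is that once everything shares a common (time, lat, lon) grid, asking a question of the data becomes a one-liner rather than a file-wrangling exercise.

    import numpy as np
    import pandas as pd
    import xarray as xr

    # Toy cube standing in for a real EO archive
    times = pd.date_range("2016-01-01", periods=12, freq="MS")
    lats = np.linspace(60.0, 50.0, 100)
    lons = np.linspace(-10.0, 5.0, 150)

    cube = xr.DataArray(
        np.random.rand(len(times), len(lats), len(lons)),
        coords={"time": times, "lat": lats, "lon": lons},
        dims=("time", "lat", "lon"),
        name="chlor_a",
    )

    # Average a region of interest through time without touching individual files
    monthly_mean = cube.sel(lat=slice(58, 54), lon=slice(-8, -2)).mean(dim=("lat", "lon"))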

I’ve already realised that there are too many interesting parallel sessions here, as I missed the ocean colour presentations, which I’ve heard were great. The good news for me is that these sessions were recorded. So if you haven’t been able to make it to Prague in person, or, like me, you are here but haven’t seen everything you wanted, there will be a selection of sessions to view on ESA’s site; for example, you can see the opening session here.

Not only do events like this give you a fantastic chance to learn about what’s happening across the EO community, but they also give you the opportunity to catch up with old friends. I am looking forward to the rest of the week!

Sentinel-3 Sets Sail

Artist’s view of Sentinel-3. Image courtesy of ESA–Pierre Carril.

At 17.57 GMT yesterday (16th February 2016) Sentinel-3 set sail from the Plesetsk Space Centre in Russia, heading for its 814 km sun-synchronous low Earth orbit. Like all the other Sentinel launches, we were at home watching the live feed!

This is the third Sentinel launch of the European Commission’s Copernicus Programme, following Sentinel-1 and 2. Sentinel-3, like its predecessors, will be part of a twin satellite constellation with Sentinel-3B’s launch expected to be in 2017.

Sentinel-3 carries four scientific instruments:

  • Sea and Land Surface Temperature Radiometer (SLSTR) will measure temperatures of both the sea and land to an accuracy of better than 0.3 K. The instrument has 9 spectral bands, with a spatial resolution of 500 m for visible/near-infrared wavelengths and 1 km for the thermal wavelengths, and swath widths of 1420 km at nadir and 750 km in the backward view. It’s worth noting that two thermal infrared wavebands are optimised for fire detection, providing a fire radiative power measurement.
  • Ocean and Land Colour Instrument (OLCI) has 21 spectral bands (400–1020 nm) focussed on ocean colour and vegetation measurements. All bands have a spatial resolution of 300 m with a swath width of 1270 km.
  • Synthetic Aperture Radar Altimeter (SRAL) has dual-frequency Ku and C bands. It offers 300 m spatial resolution after SAR processing, and is based on the instruments from the CryoSat and Jason missions. This will be the first satellite altimeter to provide 100% coverage of the Earth’s surfaces in SAR mode.
  • Microwave Radiometer (MWR) is a dual-frequency radiometer (23.8 and 36.5 GHz) used to derive atmospheric column water vapour measurements for correcting the SRAL instrument.

The scientific instruments are supported by four positioning/navigation instruments to ensure the satellite maintains its precise orbit.

Sentinel-3 will mainly focus on ocean measurements, including sea-surface height (similar to the recently launched Jason-3), sea surface temperature, ocean colour, surface wind speed, sea ice thickness and ice sheets. Over land, the satellite will provide vegetation indices, measure the height of rivers and lakes, and help monitor wildfires.

Sentinel-3 is a very exciting satellite for us, as the data and products it will produce are very much within the wheelhouse of the services that Pixalytics offers. Sam’s background is in ocean colour, she’s world-renowned for her atmospheric correction research, and we offer a variety of agritech services including vegetation indices. You can probably now see why we’re so excited!

The satellite is currently in its commissioning phase, where ESA tests the data produced by the sensors. This is undertaken in conjunction with a group of users, and Pixalytics is one of them! This phase is expected to last five months, after which the satellite will be handed over to EUMETSAT and the data should be released.

Like all the data from the Copernicus programme, it will be offered free of charge to users. This will challenge organisations, like us, to see what innovative services we can offer with this new data stream. Exciting times ahead!

Reprocessing Data: Challenges of Producing a Time Series

August 2009 Monthly Chlorophyll-a Composite; data courtesy of the ESA Ocean Colour Climate Change Initiative project

Being able to look back at how our planet has evolved over time is one of the greatest assets of satellite remote sensing. With Landsat, you have a forty-year archive with which to examine changes in land use and land cover. In situ (ground-based) monitoring, by contrast, is only available for a few locations, and you’ll only have data for the places you’re measuring. Landsat’s continuous archive is an amazing resource, and it is hoped that the European Union’s Copernicus programme will develop another comprehensive archive. So with all of this data, producing a time series analysis is easy, isn’t it?

Well, it’s not quite that simple. There are the basic issues of different missions having different sensors, so you need to know whether you’re comparing like with like. Although data continuity has been a strong element of Landsat, the sensors on Landsat 8 are very different to those on Landsat 1. Couple this with various positional, projection and datum corrections, and you have lots of things to think about to produce an accurate time series. However, once you’ve sorted all of these out and you’ve got your data downloaded, then everything is great, isn’t it?

Well, not necessarily; you’ve still got to consider data archive reprocessing. The space agencies that maintain these archives regularly reprocess their satellite datasets. This means that the data you downloaded two years ago isn’t necessarily the same data that could be downloaded today.

We faced this issue recently when NASA completed the reprocessing of the MODIS Aqua data, an exercise that began in 2014. The data from the MODIS instrument on the Aqua satellite has now been reprocessed seven times, whilst that from its twin on Terra has been reprocessed three times.

Reprocessing the data can include changes to some, or all, of the following:

  • Update of the instrument calibration, to take account of current knowledge about sensor degradation and radiometric performance.
  • Applying new knowledge, in terms of atmospheric correction and/or derived product algorithms.
  • Changes to parallel datasets that are used as inputs to the processing; for example, the meteorological conditions are used to aid the atmospheric correction.

Occasionally, they also change the output file format the data is provided in, and this is what caught us out. The MODIS output file format has changed from HDF4 to NetCDF4, the reason being that NetCDF is a more efficient, sustainable, extendable and interoperable data file format. It’s a change we’ve known about for a long time, as it resulted from community input, but until you get the new files you can’t check and update your software.

We tend to use a lot of open source software, enabling our clients to carry on working with remote sensing products without having to invest in expensive packages. The challenge is that it takes software providers time to catch up with format changes; until they do, the software either can’t load the new files or reads the data incorrectly, e.g., it comes in upside down. Sometimes a large change means you have to alter your approach and/or software.
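
As an example of the sort of small fix involved, here is a minimal sketch of reading a mapped Level-3 variable from one of the reprocessed NetCDF4 files using the netCDF4 Python library; the variable names ("chlor_a" and "lat") are assumptions based on typical ocean colour products, so check them against the files you actually have.

    import numpy as np
    from netCDF4 import Dataset

    def read_l3_variable(path, var_name="chlor_a"):
        """Read a mapped Level-3 variable from a reprocessed NetCDF4 file."""
        with Dataset(path) as nc:
            data = np.ma.masked_invalid(nc.variables[var_name][:])
            lats = nc.variables["lat"][:]
            # Guard against the grid arriving upside down: make sure the
            # latitude axis runs north to south before any further processing
            if lats[0] < lats[-1]:
                data = np.flipud(data)
        return data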

Reprocessing is important, as it improves the overall quality of the data, but you do need to keep on top of what is happening with the data to ensure that you are comparing like with like when you analyse a time series.

Goodbye HICO, Hello PACE – Ocean Colour’s Satellite Symmetry

HICO™ Data, image of Hong Kong from the Oregon State University HICO Sample Image Gallery, provided by the Naval Research Laboratory

Ocean colour is the acorn from which Pixalytics eventually grew, and so we were delighted to see last week’s NASA announcement that one of their next generation ocean colour satellites is now more secure, with a launch scheduled for 2022.

Unsurprisingly, the term ocean colour refers to the study of the colour of the ocean, although in reality it’s a name that covers a suite of different products, the central one for the open oceans being the concentration of phytoplankton. Ocean colour is determined by how much of the sun’s energy the ocean scatters and absorbs, which in turn depends on the water itself alongside substances within it, including phytoplankton and suspended sediments, together with dissolved substances and chemicals. Phytoplankton can be used as a barometer of the health of the oceans: phytoplankton are found where nutrient levels are high, whereas oceans low in nutrients have little phytoplankton. Sam’s PhD involved the measurement of suspended sediment coming out of the Humber estuary back in 1995, and it’s remained an active field of her research for the last 20 years.
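
For readers wondering how a colour becomes a phytoplankton concentration, the sketch below shows the general shape of a band-ratio chlorophyll algorithm in Python; the polynomial coefficients are placeholders for illustration only, not an operational parameterisation.

    import numpy as np

    def band_ratio_chlorophyll(rrs_blue, rrs_green, coeffs=(0.3, -2.8, 1.4, 0.0, 0.0)):
        """Illustrative band-ratio chlorophyll-a estimate (mg per cubic metre).

        rrs_blue and rrs_green are remote-sensing reflectances; the default
        coefficients are placeholders rather than a real parameterisation.
        """
        ratio = np.log10(np.maximum(rrs_blue, 1e-6) / np.maximum(rrs_green, 1e-6))
        log_chl = sum(a * ratio ** i for i, a in enumerate(coeffs))
        return 10.0 ** log_chl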

Satellite ocean colour remote sensing began with the launch of NASA’s Coastal Zone Colour Scanner (CZCS) on the 24th October 1978. It had six spectral bands, four of which were devoted to ocean colour, and a spatial resolution of around 800m. Despite only having an anticipated lifespan of one year, it operated until the 22nd June 1986 and has been used as a key dataset ever since. Sadly, CZCS’s demise marked the start of a decade gap in NASA’s ocean colour data archive.

Although there were some intermediate ocean colour missions, it was the launch of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) that brought the next significant archive of ocean colour data. SeaWiFS had 8 spectral bands optimised for ocean colour and operated at a 1 km spatial resolution. One of Sam’s first jobs was developing a SeaWiFS data processor, and it collected data until the end of its mission in December 2010.

Currently, global ocean colour data primarily comes from either NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) on-board the twin Aqua and Terra satellites, or the Visible Infrared Imaging Radiometer Suite (VIIRS), which is on a joint NOAA / NASA satellite called Suomi NPP. MODIS has 36 spectral bands and spatial resolutions ranging from 250 to 1000 m; whilst VIIRS has 22 spectral bands and a resolution of 375 to 750 m.

Until recently, there was also the ONR / NRL / NASA Hyperspectral Imager for the Coastal Ocean (HICO) mission on-board the International Space Station. It collected data over selected coastal regions, with a spectral range of 380 to 960 nm and a 90 m spatial resolution. It was designed to collect only one scene per orbit, and acquired over 10,000 such scenes after its launch. Unfortunately, it suffered damage during a solar storm in September 2014, and its retirement was officially announced a few days ago with the confirmation that it wasn’t possible to repair the damage.

In the same week we wave goodbye to HICO, NASA announced the 2022 launch of the Pre-Aerosol and ocean Ecosystem (PACE) mission, in a form of ocean colour symmetry. PACE is part of the next generation of ocean colour satellites; it’s intended to carry an ocean ecosystem spectrometer/radiometer, built by NASA’s Goddard Space Flight Centre, which will measure spectral wavebands from the ultraviolet to the near infrared. It will also have an aerosol/cloud polarimeter to help improve our understanding of the flow, and role, of aerosols in the environment.

PACE will be preceded by several other missions with an ocean colour focus, including the European Sentinel-3 mission within the next year; it will have an Ocean and Land Colour Instrument with 21 spectral bands and 300 m spatial resolution, building on Envisat’s Medium Resolution Imaging Spectrometer (MERIS) instrument. Sentinel-3 will also carry a Sea and Land Surface Temperature Radiometer. It should help to significantly improve the quality of ocean colour data by supporting improvements to the atmospheric correction.

Knowledge of the global phytoplankton biomass is critical to understanding the health of the oceans, which in turn impacts the planet’s carbon cycle and so affects the evolution of our planet’s climate. A continuous ocean colour time series is critical to this, and so we are already looking forward to the data from Sentinel-3 and PACE.

Current Work in Remote Sensing and Photogrammetry

Last week the annual Remote Sensing and Photogrammetry Society (RSPSoc) conference was held in Aberystwyth. Having stepped down as RSPSoc Chairman, I could relax and enjoy this year’s event as a delegate.

Arriving on Wednesday morning, the first session I attended was organised by the Technology and Operational Procedures Special Interest Group (TOPSIG) and focused on operational Earth observation. There was a great range of presentations, and I particularly enjoyed the user insights by Andy Wells on how customers are really using imagery. Recent developments in on-the-fly importing, georeferencing and autocorrelation mean that bringing data together from different sources isn’t a time-consuming chore. Users can therefore spend more time analysing data, extracting information and adding value to their organisations or research. In addition, as highlighted by other presentations, open software repositories continue to grow and now include complex algorithms that were once only available to specialists. Finally, Steve Keyworth reminded us that what we do should be seen as a component of the solution rather than the specification; the ultimate aim should be solving the customer’s problem, which in the current climate is often financially motivated.

Landsat 7 image showing features in the Baltic, data courtesy of ESA

On Thursday I co-chaired the Water and Marine Environments session alongside Professor Heiko Balzter, on behalf of the Marine Optics Special Interest Group (SIG). My presentation focused on the European Space Agency (ESA) Landsat archive that has been acquired via the ESA ground stations. This data is being reprocessed to create a consistent high resolution visible and infrared image dataset combining the three primary sensors used by the series of Landsat satellites: MSS (Multispectral Scanner), TM (Thematic Mapper) and ETM+ (Enhanced Thematic Mapper Plus). Although historical Landsat missions are not ideally suited to observing the ocean, due to a low signal-to-noise ratio, features can be clearly seen, and the new processing setup means images are also being processed over the open ocean.

Mark Danson’s keynote lecture on Friday morning described the application of terrestrial laser scanners to understanding forest structure. He showcased his post-PhD research, which has led to the development of the Salford Advanced Laser Canopy Analyser, a dual-wavelength full-waveform laser scanner. The presentation also showed the importance of fieldwork in understanding what remote techniques are actually sensing, which in this case included a team of people cutting down example trees and counting every leaf!

Mark also made me feel less guilty that I am still working on a component of my PhD: atmospheric correction. In research, your own learning curve and the scientific process mean you gain new insights as you understand more, often explaining why answers are not as simple as you might have assumed. It’s one of the reasons why I love doing research.

Overall, I had a great time at RSPSoc, catching up and seeing what’s new in the field. My next conference event is Ocean Optics, in the US, at the end of October where I’ll be discussing citizen science in a marine science context.

Vienna!

Last week I was in Vienna, Austria, attending the 2014 European Geosciences Union (EGU) General Assembly. It was a scientific smorgasbord laid out in front of over 12,000 people from 106 countries. Over 4,800 oral presentations were given and 9,500 posters displayed; coupled with a variety of other sessions and an exhibition, this created a varied programme. I really liked the plan to create smart umbrellas to collect rain data, which has already received press coverage.

My EGU experience began with a poster summary session on the Thursday morning; these are short three-minute presentations giving delegates a flavour of the posters on display, to encourage people to come and see them. I then moved on to watching presentations and visiting the posters.

Two presentations really caught my eye. The first was about NASA’s upcoming Cyclone Global Navigation Satellite System (CYGNSS) mission, which will study ocean surface winds using reflected Global Navigation Satellite System (GNSS) signals, which are primarily used for positioning (such as within your mobile phone) and timing measurements. This technique, often called GNSS reflectometry, was previously demonstrated on SSTL’s UK-DMC-1 mission.

The second focussed on using the SARAL/AltiKa altimeter to study storm Xaver, which hit the southern North Sea and northern Europe with hurricane-force winds and a tidal surge at the beginning of December 2013. Launched in February 2013, SARAL/AltiKa is a collaboration between the French Space Agency (CNES) and the Indian Space Research Organisation (ISRO); flying the same ground track as ESA’s lost Envisat, it fills that gap while we wait for the Copernicus Sentinel-3 mission, which will include altimetry, ocean colour and sea surface temperature instruments.

On the Friday I presented a poster on an ESA project I’m involved with, titled E-Collaboration for Earth Observation (E-CEO), which addresses the technologies and architectures needed to provide a collaborative research platform for automating data mining and information extraction experiments. Our aim is to run Earth Observation challenges akin to those used to solve computing tasks, and the poster presented the first of these challenges – focusing on the atmospheric correction of ocean colour imagery.

Home from Hawaii

I got back to a ‘cold’ UK on Saturday afternoon after spending last week at Ocean Sciences 2014.  It was a fantastic conference with over 5,600 attendees.  My scientific highlights were:

The Surface Ocean Lower Atmosphere Study (SOLAS) session on Monday, where speakers presented research on the sea surface microlayer (the top 1 mm of the ocean); this layer is important for understanding the transfer of compounds, such as carbon dioxide, and particles between the ocean and the atmosphere, exchanges that are critical to our interpretation of the climate.

On Tuesday afternoon it was the Optics and Light in the Particle-Laden Coastal Ocean session, with presentations focused on understanding the acoustic and optical signatures of particles, including their shape, from multi-angular measurements and Lidar (laser) profiling of a phytoplankton bloom.

My key session was obviously Optical Remote Sensing of Freshwater, Estuarine and Coastal Environments on Wednesday. I gave a presentation on Multi-Sensor Ocean Colour Atmospheric Correction for Time-Series Data.  Atmospheric correction is the removal of the atmosphere’s signal from data so only the water-leaving radiance signal is left; it allows data to be compared between days irrespective of the weather conditions of that day – so an image taken on a hazy day will look like it was taken on a clear day.
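
To make that a little more concrete, here is a toy sketch in Python of the "black pixel" idea that underpins many ocean colour corrections: assume the water-leaving signal in the near-infrared is zero, treat whatever is left there after Rayleigh removal as aerosol, and extrapolate it to the visible band. Operational processors use aerosol models and lookup tables rather than the single scalar factor used here.

    import numpy as np

    def black_pixel_correction(rho_toa_vis, rho_toa_nir,
                               rho_rayleigh_vis, rho_rayleigh_nir, epsilon=1.0):
        """Toy black-pixel atmospheric correction for a single visible band."""
        # Over clear ocean the NIR water-leaving signal is assumed to be zero,
        # so the NIR residual after Rayleigh removal is attributed to aerosol
        rho_aerosol_nir = rho_toa_nir - rho_rayleigh_nir
        # Extrapolate the aerosol reflectance to the visible band
        rho_aerosol_vis = epsilon * rho_aerosol_nir
        # What remains is an estimate of the water-leaving reflectance
        return np.maximum(rho_toa_vis - rho_rayleigh_vis - rho_aerosol_vis, 0.0)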

HICO™ Data, image of Hong Kong from the Oregon State University HICO Sample Image Gallery, provided by the Naval Research Laboratory

Other interesting talks from this session included Tiit Kutser’s presentation comparing in-situ measurements with MERIS data for dissolved organic carbon and iron concentrations in Lake Mälaren in Sweden, Keping Du’s retrieval algorithm for phycocyanin, a pigment within cyanobacteria, in Lake Taihu in China, and Heidi Dierssen’s work on the optics of seagrass for remote sensing. I also really enjoyed my mentee Guangming Zheng’s presentation on suspended sediment within Chesapeake Bay, on the east coast of America – it took me back to my PhD, which focussed on the suspended sediment plume from the River Humber.

Finally, there were great presentations by Curt Davis and Nick Tufillaro on the Hyperspectral Imager for the Coastal Ocean (HICO) mission. It’s an experimental mission designed to sample the coastal ocean, capturing one 50 x 200 km scene per orbit at a spatial resolution of around 90 m. The image on the right shows a HICO example.

On top of these oral sessions, I also spent time in the exhibition, poster sessions and some of the evening events.  My last event on the Thursday evening was about getting involved in the European Commission’s Horizon 2020 Research programme – so if anyone needs an Earth Observation specialist partner for their bid, get in touch!