Supporting Chimpanzee Conservation from Space

Gombe National Park, Tanzania. Acquired by Sentinel-2 in December 2016. Image courtesy of ESA.

Being able to visualise the changing face of the planet over time is one of the greatest strengths of satellite remote sensing. Our previous blog showed how Dubai’s coastline has evolved over a decade, and last week NASA described interesting work they’re doing on monitoring habitat loss for chimpanzees in conjunction with the Jane Goodall Institute.

Jane Goodall has spent over fifty years working to study and protect the chimpanzees of Gombe National Park in Tanzania, and founded the Jane Goodall Institute in 1977. The Institute works with local communities to provide sustainable conservation programmes.

A hundred years ago more than one million chimpanzees lived in Africa; today the World Wildlife Fund estimates the population at only around 150,000 to 250,000. The decline is stark. For example, the Ivory Coast populations have declined by 90% within the last twenty years.

One of the key factors contributing to this decline is habitat loss, mostly through deforestation, although other factors such as hunting, disease and illegal capture also contribute.

Forests cover around 31% of the planet, and deforestation occurs when trees are cleared and the land is converted to another use. In chimpanzee habitats, deforestation is mostly due to logging, mining and drilling for oil. This change in land use can be monitored from space using remote sensing. Satellites produce regular images which can be used to monitor changes in the natural environment, in turn giving valuable information to conservation charities and other organisations.

In 2000 Lilian Pintea, from the Jane Goodall Institute, was shown Landsat images comparing the area around the Gombe National Park in 1972 and 1999. The latter image showed huge deforestation outside the park’s boundary. The Institute has continued to use Landsat imagery to monitor what is happening around the National Park. In 2009 they began a citizen science project with local communities, giving them smartphones to report their observations. Combining these with ongoing satellite data from NASA has helped develop and implement local plans for land use and protection of the forests. Further visualisation of this work can be found here. The image at the top was acquired by Sentinel-2 in December 2016 and shows the Gombe National Park, although it is under a little haze.

The satellite data supplied by NASA comes from the Landsat missions, which currently have an archive of almost forty-five years of data that is freely available to anyone. We also used Landsat data in our Dubai animation last week. Landsat captures optical data, which means it operates in a similar manner to the human eye, although the instruments also have infrared capabilities. However, one drawback of optical instruments is that they cannot see through clouds. Therefore, whilst Landsat is great for monitoring land use under clear skies, it can be combined with synthetic aperture radar (SAR), from the microwave spectrum, which can see through both clouds and smoke. This combination enables land use and land change to be monitored anywhere in the world. Using the freely available Landsat and Sentinel-1 SAR data you could monitor what is happening to the forests in your own neighbourhood.
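As a flavour of how optical data supports this kind of monitoring, the sketch below computes the widely used Normalised Difference Vegetation Index (NDVI) from red and near-infrared reflectance and flags a large drop between two dates as possible forest loss. The reflectance values and the 0.2 threshold are purely illustrative assumptions, not part of any operational product.

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index: healthy vegetation
    reflects strongly in the near-infrared and absorbs red light."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Hypothetical surface reflectances for the same pixel on two dates
ndvi_before = ndvi(nir=0.45, red=0.08)   # dense forest canopy
ndvi_after = ndvi(nir=0.25, red=0.18)    # sparser cover a year later

# Illustrative rule: flag a large NDVI drop as possible forest loss
possible_loss = (ndvi_before - ndvi_after) > 0.2
```

In practice this comparison would be run pixel by pixel across co-registered, cloud-free scenes, with SAR helping to fill the gaps where cloud blocks the optical view.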

Satellite data is a powerful tool for monitoring changes in the environment, and the archive available offers a unique opportunity to see what has happened over the last four decades.

Goodbye to EO-1

Hyperspectral data of fields in South America classified using Principal Components Analysis. Data acquired by Hyperion. Image courtesy of NASA.

In contrast to our previous blog, this week’s is a celebration of the Earth Observing-1 (EO-1) satellite whose death will soon be upon us.

EO-1 was launched on the 21st November 2000 from Vandenberg Air Force Base, California. It has a polar sun-synchronous orbit at a height of 705 km, following the same orbital track as Landsat-7, but lagging one minute behind. It was put into this orbit to allow for a comparison with Landsat 7 images in addition to the evaluation of EO-1’s instruments.

It was the first in NASA’s New Millennium Program Earth Observing series, which had the aim of developing and testing advanced technology and land imaging instruments, particularly related to spatial, spectral and temporal characteristics not previously available.

EO-1 carries three main instruments:

  • Hyperion is an imaging spectrometer which collects data in 220 visible and infrared bands at 30 m spatial resolution with a 7.5 km x 100 km swath. Hyperion has offered a range of benefits to applications such as mining, geology, forestry, agriculture, and environmental management.
  • Advanced Land Imager (ALI) is a multispectral imager capturing 9 bands at 30 m resolution, plus a panchromatic band at 10 m, with a swath width of 37 km. Seven of its spectral bands match those of Landsat 7, although it collects data via a different method. ALI uses a pushbroom technique, in which a line of detectors acts like a broom head, collecting a full strip of data as the satellite moves forward, as if a broom were being pushed along the ground. Landsat, by contrast, operates a whiskbroom approach, which involves several linear detectors (i.e., broom heads) perpendicular (at a right angle) to the direction of data collection. These detectors are stationary in the sensor, and a mirror underneath sweeps from left to right, reflecting the energy from the Earth into the detectors pixel by pixel.
  • The Atmospheric Corrector (LAC) instrument allows the correction of imagery for atmospheric variability, primarily water vapour, by measuring the actual rate of atmospheric absorption rather than using estimates.
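To make the pushbroom/whiskbroom distinction concrete, here is a toy count (the detector numbers are invented for illustration, not taken from either instrument) of how many detector read-outs each approach needs to build the same small image:

```python
# Toy image: 3 along-track steps, each row 4 cross-track pixels wide
steps, width = 3, 4

# Pushbroom (ALI-style): a whole line of detectors reads one full row per step
pushbroom_readouts = steps * 1

# Whiskbroom (Landsat-style): a mirror sweeps a stationary detector across
# each row, so every pixel in the row needs its own read-out
whiskbroom_readouts = steps * width
```

The pushbroom design trades more detectors for far fewer moving parts and longer dwell time per pixel, which is one reason it was trialled on EO-1.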

The original EO-1 mission was only due to last one year, but with a sixteen-year lifetime it has surpassed all expectations. The extension of the one-year mission was driven by the Earth observation user community, who were very keen to continue with the data collection, and an agreement was reached with NASA to continue.

Pseudo-true colour hyperspectral data of fields in South America. Data acquired by Hyperion. Image courtesy of NASA.

All the data collected by both Hyperion and ALI is freely available through the USGS Earth Resources Observation and Science (EROS) Center. At Pixalytics we’ve used Hyperion data to understand the capabilities of hyperspectral data. The two images shown in this blog are a subset of a scene acquired over fields in South America, with the image to the right being a pseudo-true colour composite stretched to show the in-field variability.

The image at the top shows the hyperspectral data classified using a statistical procedure called Principal Components Analysis (PCA), which extracts patterns from within the dataset. The first three derived uncorrelated variables, termed principal components, are shown as a colour composite.
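For readers curious how such a classification input is produced, here is a minimal PCA sketch in Python. It uses a small synthetic dataset (100 pixels by 5 correlated "bands") rather than real Hyperion data, which has 220 bands, but the procedure is the same: centre the data, take the eigenvectors of the band-to-band covariance matrix, and project onto the leading eigenvectors.

```python
import numpy as np

# Synthetic stand-in for a hyperspectral scene: 100 pixels x 5 bands,
# all bands driven by one underlying signal plus a little noise
rng = np.random.default_rng(42)
signal = rng.normal(size=(100, 1))
pixels = np.hstack([signal * w for w in (1.0, 0.8, 0.6, 0.4, 0.2)])
pixels += rng.normal(scale=0.05, size=pixels.shape)

# PCA: centre, then eigendecompose the covariance matrix
centred = pixels - pixels.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centred, rowvar=False))
order = np.argsort(eigvals)[::-1]            # largest variance first

# Project onto the first three principal components, as in the composite
components = centred @ eigvecs[:, order[:3]]
explained = eigvals[order] / eigvals.sum()   # variance fraction per PC
```

Displaying the three columns of `components` as red, green and blue channels gives a false-colour composite like the one at the top of this post.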

Sadly, satellites cannot go on forever, and EO-1 is in its final few weeks of life. It stopped accepting data acquisition requests on the 6th January 2017, and will stop providing data by the end of February.

It has been a great satellite, and will be sadly missed.

Earth Observation Looking Good in 2017!

Artist's rendition of a satellite - paulfleet/123RF Stock Photo

2017 is looking like an exciting year for Earth Observation (EO), judging by the number of significant satellites planned for launch.

We thought it would be interesting to give an overview of some of the key EO launches we’ve got to look forward to in the next twelve months.

The European Space Agency (ESA) has planned launches of:

  • Sentinel-2B in March, Sentinel-5p in June and Sentinel-3B in August – all of which we discussed last week.
  • The ADM-Aeolus satellite is intended to be launched by the end of the year, carrying the Atmospheric Laser Doppler Instrument. This is essentially a lidar which will provide global measurements of wind profiles from the ground up to the stratosphere with 0.5 to 2 km vertical resolution.

From the US, both NASA and NOAA have important satellite launches:

  • NASA’s Ionospheric Connection Explorer (ICON) Mission is planned for June, and will provide observations of Earth’s ionosphere and thermosphere; exploring the boundary between Earth and space.
  • NASA’s ICESat-2, planned for November, will measure ice sheet elevation, ice sheet thickness changes and the Earth’s vegetation biomass.
  • In June NOAA will be launching the first of its Joint Polar Satellite System (JPSS) missions, a series of next-generation polar-orbiting weather observatories.
  • Gravity Recovery And Climate Experiment Follow-On (GRACE-FO) is a pair of twin satellites intended to extend the measurements of the original GRACE mission, maintaining data continuity. These satellites use microwaves to measure changes in the Earth’s gravity field to help map changes in the oceans, ice sheets and land masses. The launch is planned for right at the end of 2017, and the mission is a partnership between NASA and the German Research Centre for Geosciences.

Some of the other launches planned include:

  • Kanopus-V-IK is a small Russian remote sensing satellite with an infrared capability to be used for forest fire detection. It has a 5 m by 5 m spatial resolution over a 2000 km swath, and is planned to be launched next month.
  • Vegetation and Environment monitoring on a New MicroSatellite (VENµS), which is a partnership between France and Israel, has a planned launch of August. As its name suggests, it will be monitoring ecosystems, global carbon cycles, land use and land change.
  • KhalifaSat is the third EO satellite of the United Arab Emirates Institution for Advanced Science and Technology (EIAST). It is an optical satellite with a spatial resolution of 0.75 m for the visible and near infrared bands.

Finally, one of the most intriguing launches involves three satellites that form the next part of India’s CartoSat mission. These satellites will carry both high resolution multispectral imagers and a panchromatic camera, and the mission’s focus is cartography. It’s not these three satellites that make this launch intriguing, though; it is the one hundred other satellites that will accompany them!

The Indian Space Research Organisation’s Polar Satellite Launch Vehicle, PSLV-C37, will aim to launch a record 103 satellites in one go. Given that the current record for satellites launched in one go is 37, and that over the last few years we’ve only had around two hundred and twenty satellites launched in an entire year; this will be a hugely significant achievement.

So there you go. Not a fully comprehensive list, as I know there will be others, but hopefully it gives you a flavour of what to expect.

It certainly shows that EO is not slowing down, and the amount of data available is continuing to grow. This of course gives everyone working in the industry more challenges in terms of storage and processing power, but they are good problems to have. Exciting year ahead!

Will Earth Observation’s power base shift in 2017?

Blue Marble image of the Earth taken by the crew of Apollo 17 on Dec. 7 1972. Image Credit: NASA

We’re only a few days into 2017, but this year may see the start of a seismic shift in the Earth Observation (EO) power base.

We’ve previously described how the sustainable EO industry really began this week thirty-nine years ago. On 6th January 1978 NASA deactivated Landsat-1; it had already launched Landsat-2, carrying the same sensors, three years earlier, and with guaranteed data continuity our industry effectively began.

Since then the USA, through the data collected by NASA and NOAA satellites, has led the global EO community. This position was cemented in 2008 when it made all Landsat data held by the United States Geological Survey (USGS) freely available, via the internet, to anyone in the world. This gave scientists three decades’ worth of data to start investigating how the planet had changed, and companies sprang up offering services based entirely on Landsat data. This model of making data freely available has been so transformational that the European Union decided to follow it with its Copernicus Programme.

Landsat-1 and 2 were followed by 4, 5, 7 & 8 (sadly, Landsat 6 never reached orbit), and Landsat 9 is planned for launch in 2020. The USA’s EO leadership role has never been in question, until now.

US President-elect Donald Trump and his team have already made a number of statements indicating that they intend to cut back on NASA’s Earth Science activities. There are a variety of rumours suggesting reasons for this change of approach. However, irrespective of the reason, slashing the current $2 billion Earth Science budget would have huge consequences. Whilst all of this is just conjecture at the moment, the reality will be seen after 20th January.

Against this American backdrop sits the Copernicus Programme, with the European Space Agency due to launch another three satellites this year:

  • Sentinel 2B is planned for March. This is the second of the twin constellation optical satellites offering a spatial resolution of 10 m for the visible bands. The constellation will revisit the same spot over the equator every five days, with a shorter temporal resolution for higher latitudes.
  • June is the scheduled month for the launch of the Sentinel 5 Precursor EO satellite to measure air quality, ozone, pollution and aerosols in the Earth’s atmosphere. This will be used to reduce the data gaps between Envisat, which ended in 2012, and the launch of Sentinel-5.
  • Sentinel 3B is due to be launched in the middle of the year and, like 2B, is the second in a twin satellite constellation. This pair is mainly focussed on the oceans, measuring sea surface topography, sea and land surface temperature, and ocean and land colour. It will provide global coverage every two days with the Sea and Land Surface Temperature Radiometer (SLSTR) and the Ocean and Land Colour Instrument (OLCI).

These launches will give the Copernicus programme seven satellites collecting a wide variety of optical and radar data across the entire planet, all of which is made freely available to anyone. It’s obvious what will fill any vacuum created by a reduction in Earth Science in the USA.

Depending on how much of the next US President’s rhetoric is turned into action, we may start to see the shift of the EO power base to Europe. Certainly going to be an interesting year ahead!

Small Satellites Step Forward

Artist's concept of one of the eight Cyclone Global Navigation Satellite System satellites deployed in space above a hurricane. Image courtesy of NASA.

We’re all about small satellites in this blog, after looking at the big beast that is GOES-R last week. Small satellites, microsatellites, cubesats, or one of the myriad other names they’re known by, have been in the news this month.

Before looking at what’s happening, we’re going to start with some definitions. Despite the terms often being used interchangeably, they are distinct, defined by either cubic size or wet mass – ‘wet mass’ refers to the weight of the satellite including fuel, whereas dry mass is the weight of the satellite alone:

  • Small satellites (smallsats), also known as minisats, have a wet mass of between 100 and 500 kg.
  • Microsats generally have a wet mass of between 10 and 100 kg.
  • Nanosats have a wet mass of between 1 and 10 kg.
  • Cubesats are a class of nanosats that have a standard size. One Cubesat measures 10x10x10 cm, known as 1U, and has a wet mass of no more than 1.33 kg. However, it is possible to join multiple cubes together to form a larger single unit.
  • Picosats have a wet mass of between 0.1 and 1 kg.
  • Femtosats have a wet mass of between 10 and 100 g.

To give a comparison, GOES-R had a wet mass of 5,192 kg, a dry mass of 2,857 kg, and a size of 6.1 m x 5.6 m x 3.9 m.

Small satellites have made headlines for a number of reasons, and the first two came out of a NASA press briefing given by Michael Freilich, Director of NASA’s Earth science division on the 7th November. NASA is due to launch the Cyclone Global Navigation Satellite System (CYGNSS) on 12th December from Cape Canaveral. CYGNSS will be NASA’s first Earth Observation (EO) small satellite constellation. The mission will measure wind speeds over the oceans, which will be used to improve understanding, and forecasting, of hurricanes and storm surges.

The constellation will consist of eight small satellites in low Earth orbits, which will be focussed over the tropics rather than the whole planet. Successive satellites in the constellation will pass over the same area every twelve minutes, enabling an image of wind speed over the entire tropics every few hours.

Each satellite will carry a Delay Doppler Mapping Instrument (DDMI) which will receive signals from existing GPS satellites and the reflection of that same signal from the Earth. The scattered signal from the Earth will measure ocean roughness, from which wind speed can be derived. Each microsatellite will weigh around 29 kg and measure approximately 51 x 64 x 28 cm; on top of this will be solar panels with a span of 1.67 m.

The second interesting announcement as reported by Space News, was that NASA is planning to purchase EO data from other small satellite constellation providers, to assess the quality and usability of that data. They will be one-off purchases with no ongoing commitment, and will sit alongside data from existing NASA missions. However, it is difficult not to assume that a successful and cost effective trial could lead to ongoing purchases, which could replace future NASA missions.

It’s forecast that this initiative could be worth in the region of $25 million, and will surely interest the existing suppliers such as Planet or TerraBella; however, in the longer term it could also attract new players to the market.

Finally, in non-NASA small satellite news, there was a joint announcement at the start of the month by the BRICS states (Brazil, Russia, India, China and South Africa) that they’d agreed to create a joint satellite constellation for EO. No further detail is available at this stage.

Once again, this shows what a vibrant, changing and evolving industry we work in!

Remote Sensing: Learning, Learned & Rewritten

Image of Yemen acquired by Sentinel-2 in August 2015. Data courtesy of ESA.

This blog post is about what I did and what thoughts came to mind on my three-month long ERASMUS+ internship at Pixalytics which began in July and ends this week.

During my first week at Pixalytics, after being introduced to the Plymouth Science Park buildings and the office, my first task was to get a basic understanding of what remote sensing is actually about. With the help of Sam and Andy’s book, Practical Handbook of Remote Sensing, that was pretty straightforward.

As the words suggest, remote sensing is the acquisition of data and information about an object without needing to be on site. It is then possible to perform a variety of analyses and processing on this data to better understand and study the physical, chemical and biological phenomena that affect the environment.

Examples of programming languages: C, Python & IDL

I soon realized that quite a lot of programming is involved in the analysis of satellite data. In my view, though, some of the scripts, written in IDL (Interactive Data Language), were not as fast and efficient as they could be, sometimes not at all. With that in mind, I decided to rewrite one of the scripts, turning it into a C program. This allowed me to get a deeper understanding of satellite dataset formats (e.g. HDF, Hierarchical Data Format) and improve my overall knowledge of remote sensing.

While IDL, a long-established scientific language for remote sensing, provides a quick way of writing code, it has a number of glaring downsides. Poor memory management and a complete lack of strictness often lead to scripts that break easily. It’s also quite easy to write not-so-pretty, confusing spaghetti code, i.e. twisted and tangled code.

Writing C code, on the other hand, can get overly complicated and tedious for tasks that would require just a few lines in IDL. While it gives the programmer almost full control of what’s going on, sometimes it’s just not worth the time and effort.

Instead, I chose to rewrite the scripts in Python, which I found to be quite a good compromise. Indentation can sometimes be a bit annoying, and coming from other languages the syntax might seem unusual, but its great community and the wide availability of modules to achieve your goals in just a few lines really make up for it.

It was soon time to switch to a bigger and more complex task, which has been, to this day, what I would call my “main task” during my time at Pixalytics: building an automated online processing website. The website aspect was relatively easy with a combination of the usual HTML, Javascript, PHP and CSS; it was rewriting and integrating the remote sensing scripts that was difficult. Finally, all of those little, and sometimes not quite so little, scripts and programs were available from a convenient web interface, bringing much satisfaction and pride for all those hours of heavy thinking and brainstorming. Hopefully, you will read more about this development from Pixalytics in the future, as it will form the back-end of their product suite to be launched in the near future.

During my internship there was also time for events inside the Science Park, such as the Hog Roast, and events outside as well, when I participated in the South-West England QGIS User Group meeting in Dartmoor National Park. While that meeting was not exactly about remote sensing, being more on the Geographic Information System (GIS) side, it made me realize how much I had learned about remote sensing in my short time at Pixalytics, and I was able to exchange opinions and points of view with other people keen on the subject.

A side project I’ve been working on in my final weeks was scanning the world for stunning or interesting (and possibly both) places on Earth to make postcards from – such as the one at the top of this blog. At times, programming and scientific research reading can get challenging and/or frustrating, and it’s so relaxing to just look at and enjoy the beauty of our planet.

It is something that anyone can do, as it takes little knowledge of remote sensing. Free satellite imagery is available from a variety of sources; what I found quite easy to access and use was imagery from USGS/NASA Landsat-8 and ESA Sentinel-2. It is definitely something I would recommend.

Finally, I want to say “thank you” to Sam and Andy, without whom I would never have had the opportunity to get the most out of this experience, in a field in which I’ve always been interested but had never had the chance to actually get my hands on.

Blog written by Davide Mainas on an ERASMUS+ internship with Pixalytics via the Tellus Group.

Monitoring ocean acidification from space

Enhanced pseudo-true colour composite of the United Kingdom showing coccolithophore blooms in light blue. Image acquired by MODIS-Aqua on 24th May 2016. Data courtesy of NASA.

What is ocean acidification?
Since the industrial revolution the oceans have absorbed approximately 50% of the CO2 produced by human activities (The Royal Society, 2005). Scientists previously saw this oceanic absorption as advantageous; however, ocean observations in recent decades have shown it has caused a profound change in ocean chemistry, resulting in ocean acidification (OA). As CO2 dissolves into the oceans it forms carbonic acid, lowering the pH and moving the oceans into a more acidic state. According to the National Oceanic and Atmospheric Administration (NOAA), ocean acidity (the hydrogen-ion concentration, since pH is a logarithmic scale) has already increased by about 30%, and some studies suggest that if no changes are made it could increase by 150% by 2100.
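Because pH is a logarithmic scale, percentage changes in acidity translate into small-looking pH numbers. The quick calculation below (taking an approximate pre-industrial surface pH of 8.2 as an assumption) shows that a ~30% rise in hydrogen-ion concentration corresponds to a drop of only about 0.1 pH units:

```python
import math

def ph(h_ion):
    """pH is the negative base-10 logarithm of hydrogen-ion concentration."""
    return -math.log10(h_ion)

h_preindustrial = 10 ** -8.2      # assumed pre-industrial surface [H+], mol/L
h_today = h_preindustrial * 1.30  # ~30% more hydrogen ions

ph_drop = ph(h_preindustrial) - ph(h_today)  # about 0.11 pH units
```

By the same arithmetic, a 150% increase (2.5 times the hydrogen-ion concentration) would correspond to a drop of about 0.4 pH units.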

Impacts of OA
It’s anticipated OA will impact many marine species. For example, it’s expected to have a harmful effect on calcifying species such as corals, oysters, crustaceans, and calcareous plankton, e.g. coccolithophores.

OA can significantly reduce the ability of reef-building corals to produce their skeletons, and can cause the dissolution of oysters’ and crustaceans’ protective shells, making them more susceptible to predation and death. This in turn would affect the entire food web and the wider environment, and would have many socio-economic impacts.

Calcifying phytoplankton, such as coccolithophores, are thought to be especially vulnerable to OA. Coccolithophores are the most abundant type of calcifying phytoplankton in the ocean, are important for the global biogeochemical cycling of carbon, and sit at the base of many marine food webs. It’s projected that OA may disrupt the formation, and/or accelerate the dissolution, of their calcium carbonate (CaCO3) shells, impacting future populations. Thus, changes in their abundance due to OA could have far-reaching effects.

Unlike other phytoplankton, coccolithophores are highly effective light scatterers relative to their surroundings, due to their production of highly reflective calcium carbonate plates. This allows them to be easily seen in satellite imagery. The figure at the top of this page shows multiple coccolithophore blooms, in light blue, off the coast of the United Kingdom on 24th May 2016.

Current OA monitoring methods
Presently, the monitoring of OA and its effects is predominantly carried out by in situ observations from ships and moorings, using buoys and wave gliders for example. Although vital, in situ data is notoriously spatially sparse, as it is difficult to take measurements in certain areas of the world, especially hostile regions (e.g. the polar oceans). On their own, in situ observations do not provide a comprehensive and cost-effective way to monitor OA globally. Consequently, this has driven the development of satellite-based approaches.

How can OA be monitored from space?
Although it is difficult to directly monitor changes in ocean pH using remote sensing, satellites can measure sea surface temperature and salinity (SST & SSS) and surface chlorophyll-a, from which ocean pH can be estimated using empirical relationships derived from in situ data. Although surface measurements may not be representative of deeper biological processes, surface observations are important for OA because the change in pH occurs at the surface first.
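As a sketch of what such an empirical relationship looks like, the function below combines SST and SSS linearly. The form and every coefficient are hypothetical, chosen only to illustrate the idea; real algorithms are fitted region by region against in situ carbonate chemistry measurements.

```python
def estimate_ph(sst_c, sss_psu):
    """Illustrative (not operational) linear model of surface ocean pH
    from sea surface temperature (deg C) and salinity (PSU)."""
    intercept, t_coeff, s_coeff = 8.6, -0.01, -0.005  # hypothetical values
    return intercept + t_coeff * sst_c + s_coeff * sss_psu

# Hypothetical mid-latitude pixel from satellite SST and SSS retrievals
ph_estimate = estimate_ph(sst_c=18.0, sss_psu=35.2)
```

Running such a model over gridded SST and SSS fields is what turns point measurements from ships into the kind of worldwide acidity map described below.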

In 2015 researchers at the University of Exeter, UK became the first scientists to use remote sensing to develop a worldwide map of the ocean’s acidity, using satellite imagery from the European Space Agency’s Soil Moisture and Ocean Salinity (SMOS) satellite, launched in 2009, and NASA’s Aquarius satellite, launched in 2011; both are still in operation. Thermal infrared sensors measure the SST, while microwave sensors measure SSS; microwave SST sensors also exist, but they have a coarse spatial resolution.

Future Opportunities – The Copernicus Program
The European Union’s Copernicus Programme is in the process of launching a series of satellites, known as Sentinel satellites, which will improve understanding of large-scale global dynamics and climate change. Of all the Sentinel satellite types, Sentinels 2 and 3 are the most appropriate for assessment of the marine carbonate system. The Sentinel-3 satellite was launched in February this year and will mainly focus on ocean measurements, including SST, ocean colour and chlorophyll-a.

Overall, OA is a relatively new field of research, with most of the studies being conducted over the last decade. It’s certain that remote sensing will have an exciting and important role to play in the future monitoring of this issue and its effects on the marine environment.

Blog written by Charlie Leaman, BSc, University of Bath during work placement at Pixalytics.

Playboy Magazine & Remote Sensing

Blue Marble image of the Earth taken by the crew of Apollo 17 on Dec. 7 1972. Image Credit: NASA

Are you aware of the role Playboy Magazine has played in the remote sensing and image processing industries? Anyone who has read a selection of image processing books or journals will probably recognise the Lena picture as a standard test image. If you don’t know the image, you can find it here. Lena’s history is interesting.

It began in 1973 when Alexander Sawchuk, who was then an assistant professor at the USC Signal and Image Processing Institute, was part of a small team searching for a human face to scan for a colleague’s conference paper. They wanted a glossy image to get a good output dynamic range and during the search someone walked in with the November 1972 issue of Playboy. They used the centrefold image, the Swedish model Lena Söderberg, so they could wrap it around the drum of their scanner. As they only needed a 512 x 512 image, they scanned the top 5.12 inches of the picture, creating a head shot rather than the original full nude centrefold.

From this beginning Lena, often called Lenna as this was the forename used in Playboy, has gone on to become one of the most commonly used standard test images. There are a number of theories as to why this is the case, including:

  • The image has a good variety of different textural elements, such as light and dark, fuzzy and sharp, detailed and flat.
  • The grayscale version contains all the middle grays.
  • She has a symmetrical face, making any errors easy to see.
  • The image processing community is predominantly male!

Most often the image is used for compression testing, but it has also been used in the analysis of a wide variety of other techniques, such as the application of filtering for edge enhancement. Even as recently as three years ago, a group of scientists from Singapore shrunk the Lena image down to the width of a human hair as a demonstration of nanotechnology printing.

The wide use of Lena eventually came to the notice of Playboy, after the magazine Optical Engineering put her on their front cover in 1991. The Playboy organisation then tried to assert its copyright; however, the genie was out of the bottle given the sheer number of people using the image. The following year Optical Engineering reached an agreement with Playboy to continue using the image for scientific research and education. The copyright issues are why we didn’t include the Lena image in this blog; although it has been reported that Playboy now overlooks the use of Lena for image processing, we decided not to risk it! Playboy did help in the search for Lena in 1997, which enabled her to make a public appearance at the 50th Annual Conference of the Society for Imaging Science and Technology. An article written by Jamie Hutchinson giving a more detailed version of the Lena story can be found here.

What’s interesting about Lena is that despite all the technological advancements in the last forty years, she is still used as a standard testing image. Contrast this with the famous Blue Marble image of the Earth taken around the same time by astronauts aboard Apollo 17. The 1972 Blue Marble is probably the most iconic picture of the Earth, and unlike Lena has inspired numerous later images. For example, NASA used the Terra satellite to produce a detailed true-colour image of the Earth in 2002 and then three years later surpassed it with a new image that had twice as much detail as the original. The latest NASA Blue Marble was issued last year, captured by the US DSCOVR Earth observation satellite.

Standard test images are important, but the image processing community should probably start to think about updating the ones we use. Anyone got any ideas?

Sentinel’s Milestone and Millstone

Sentinel-1A multi-temporal colour composite of land coverage across Ireland. Contains modified Copernicus Sentinel data [2015], processed by ESA. Data courtesy of ESA.

A significant milestone was achieved for the European Commission’s Copernicus Programme with the launch of the Sentinel-1B satellite. It was the fourth Sentinel satellite to be launched and, as the second Sentinel-1 satellite, it completes the first of the planned constellations.

It was launched on 25th April from French Guiana. In addition to Sentinel-1B, three student cubesats were onboard the Soyuz rocket. Students from the University of Liege, the Polytechnic of Turin in Italy, and the University of Aalborg developed the 10 cm cube satellites as part of ESA’s ‘Fly Your Satellite!’ programme, and these will be deployed into orbit.

Sentinel-1B is an identical twin to Sentinel-1A which was launched on the 3rd April 2014, and they will operate as a pair constellation orbiting 180 degrees apart at an altitude of approximately 700 km. They both carry a C-band Synthetic Aperture Radar (SAR) instrument and together will cover the entire planet every six days, although the Arctic will be revisited every day and Europe, Canada and main shipping routes every three days.
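The six-day figure follows from simple arithmetic: evenly phasing identical satellites around the same orbit divides the repeat cycle between them. As a minimal sketch (ignoring swath overlap and acquisition planning, which shorten effective revisit times further at high latitudes):

```python
# Sketch: how a pair constellation shortens the repeat cycle.
# A single Sentinel-1 satellite has a 12-day repeat cycle; two identical
# satellites phased 180 degrees apart halve that to 6 days.

def constellation_revisit_days(single_satellite_cycle_days, n_satellites):
    """Repeat cycle for n identical, evenly phased satellites in one orbit."""
    return single_satellite_cycle_days / n_satellites

print(constellation_revisit_days(12, 1))  # Sentinel-1A alone -> 12.0
print(constellation_revisit_days(12, 2))  # Sentinel-1A + 1B  -> 6.0
```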

Sentinel-1 data has a variety of applications, including monitoring sea ice, maritime surveillance, humanitarian aid in disasters, and mapping for forest, water and soil management. The benefits were demonstrated this week with:

  • Issuing a video showing the drop in rice-growing productivity in the Mekong River Delta over the last year; and
  • The multi-temporal colour composite of land coverage of Ireland as shown at the top of this post. It was created from 16 radar scans over 12 days during May 2015, where:
    • The blues represent changes in water or agricultural activities such as ploughing;
    • The yellows represent urban centres;
    • Vegetated fields and forests appear in green; and
    • The reds and oranges represent unchanging features such as bare soil.
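In essence, a multi-temporal colour composite assigns radar scans from different dates to the red, green and blue channels, so areas that changed between acquisitions show up in colour while stable features appear grey. A minimal sketch of the idea with NumPy (the toy arrays and scaling are illustrative, not ESA’s actual processing chain):

```python
import numpy as np

def multitemporal_composite(scan_date1, scan_date2, scan_date3):
    """Stack three co-registered backscatter arrays into an RGB image."""
    def normalise(band):
        # Scale each band to the 0..1 range for display
        band = band.astype(float)
        return (band - band.min()) / (band.max() - band.min())
    return np.dstack([normalise(scan_date1),
                      normalise(scan_date2),
                      normalise(scan_date3)])

# Toy 2x2 'scans': the top-right pixel is bright only on the first date,
# so it takes on a red cast in the composite; stable pixels stay grey.
rgb = multitemporal_composite(np.array([[1, 5], [3, 3]]),
                              np.array([[1, 2], [3, 3]]),
                              np.array([[1, 2], [3, 3]]))
print(rgb.shape)  # (2, 2, 3)
```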

With this constellation up and working, the revisit speed has the chance to be a game changer in the uptake of space-generated data.

Sadly there’s a millstone hanging around the Copernicus Programme’s neck hindering this change – accessing the data remains difficult for commercial organisations.

Currently, selecting and downloading Sentinel data is a painful process, one that mostly either does not work or is so slow that you give up on it! This is caused by the size of the datasets and the popularity of data that is free to access for everyone worldwide.

There are a number of ways of getting access to this data, with varying success in our experience, including:

  • EU’s Copernicus Hub – Operational, but slow to use. Once you have selected the data to download, either manually or via a script, the process is extremely slow and often times out before completing the download.
  • USGS – Offers Sentinel-2, but not Sentinel-1, data via its EarthExplorer and GloVis interfaces. The download process is easier, but the format of Sentinel-2 makes searching a bit strange in GloVis, and it’s only a partial representation of the available acquisitions.
  • The UK Collaborative Ground Segment Access, despite signing an agreement with ESA in March 2015, has not yet been made available for commercial entities.
  • It is possible to apply for access to the academically focused STFC Centre for Environmental Data Analysis (CEDA) system, which provides FTP access and has good download speeds for the data that’s available.
  • Amazon’s archive of Sentinel-2 data, which has good download speeds but is cumbersome to search without developing software, i.e. scripts.
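For those scripting access, a search against the Copernicus Hub’s OpenSearch API can be built along these lines. This is a minimal sketch: the endpoint and query keywords reflect the hub’s documented interface, and registration and authentication are still needed before anything can actually be downloaded.

```python
from urllib.parse import urlencode

# Base URL of the Copernicus Open Access Hub's OpenSearch endpoint
BASE_URL = "https://scihub.copernicus.eu/dhus/search"

def build_search_url(platform, start_date, end_date, rows=10):
    """Build an OpenSearch query for products from one Sentinel platform,
    acquired between start_date and end_date (YYYY-MM-DD strings)."""
    query = (f"platformname:{platform} AND "
             f"beginposition:[{start_date}T00:00:00.000Z TO "
             f"{end_date}T23:59:59.999Z]")
    return BASE_URL + "?" + urlencode({"q": query, "rows": rows})

# Example: Sentinel-1 products from the 12 days used for the Ireland mosaic
url = build_search_url("Sentinel-1", "2015-05-01", "2015-05-12")
print(url)
```

The returned URL can then be fetched (with hub credentials) to get an XML feed of matching products, which is the kind of plumbing that scripts against the hub typically automate.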

There are also further services and routes being developed to facilitate searching and downloading from the various archives, e.g., there’s a QGIS ‘Semi-Automatic Classification’ plugin and EOProc SatCat service for Sentinel-2. With the Sentinel-3A data coming online soon the situation will get more complex for those of us trying to use data from all the Sentinel missions.

Getting the satellites into space is great, but that is only the first step. Until the data is put into the hands of people who can use it to create value and inspire others, the Sentinel missions will not fulfil their potential in widening the use of space-generated data.

The cost of ‘free data’

False Colour Composite of the Black Rock Desert, Nevada, USA. Image acquired on 6th April 2016. Data courtesy of NASA/JPL-Caltech, from the Aster Volcano Archive (AVA).

Last week, the US and Japan announced free public access to the archive of nearly 3 million images taken by the ASTER instrument; previously this data had only been accessible for a nominal fee.

ASTER, Advanced Spaceborne Thermal Emission and Reflection Radiometer, is a joint Japan-US instrument aboard NASA’s Terra satellite with the data used to create detailed maps of land surface temperature, reflectance, and elevation. When NASA made the Landsat archive freely available in 2008, an explosion in usage occurred. Will the same happen to ASTER?

As a remote sensing advocate I want many more people to be using satellite data, and I support any initiative that contributes to this goal. Public satellite data archives, such as Landsat, are often referred to as ‘free data’. This phrase is unhelpful, and I prefer the term ‘free to access’. This is because ‘free data’ isn’t free: someone has already paid to get the satellites into orbit, download the data from the instruments and then provide the websites that make this data available. So, who has paid for it? To be honest, it’s you and me!

To be accurate, these missions are generally funded by the taxpayers of the country that put the satellite up. For example:

  • ASTER was funded by the American and Japanese public;
  • Landsat is funded by the American public; and
  • The Sentinel satellites, under the Copernicus missions, are funded by the European public.

In addition to making basic data available, missions often also create a series of products derived from the raw data. This is achieved either by commercial companies being paid grants to create these products, which can then be offered as free-to-access datasets, or alternatively the companies develop the products themselves and then charge users for access to them.

‘Free data’ also creates user expectations, which may be unrealistic. Whenever a potential client comes to us, there is always a discussion about which data source to use. Pixalytics is a data-independent company, and we suggest the best data to suit the client’s needs. However, the best option isn’t always a free-to-access dataset! There are a number of physical and operating criteria that need to be considered:

  • Spectral wavebands / frequency bands: the wavelengths for optical instruments and frequencies for radar instruments, which determine what can be detected.
  • Spatial resolution: the size of the smallest objects that can be ‘seen’.
  • Revisit times: how often you are likely to get a new image – important if you’re interested in several acquisitions that are close together.
  • Long-term archives of data: very useful if you want to look back in time.
  • Availability: for example, delivery schedule and ordering requirements.

We don’t want any client to pay for something they don’t need, but sometimes commercial data is the best solution. As the cost of this data can range from a few hundred to thousands of pounds, this can be a challenging conversation given all the promotion of ‘free data’.

So, what’s the summary here?

If you’re analysing large amounts of data, e.g. for a time-series or large geographical areas, then free to access public data is a good choice as buying hundreds of images would often get very expensive and the higher spatial resolution isn’t always needed. However, if you want a specific acquisition over a specific location at high spatial resolution then the commercial missions come into their own.

Just remember, no satellite data is truly free!