Have you read the top Pixalytics blogs of 2016?

Artist’s rendition of a satellite – paulfleet/123RF Stock Photo

As this is the final blog of the year, we’d like to take a look back over the past fifty-two weeks and see which blogs captured people’s attention, and conversely which did not!

It turns out that seven of the ten most widely viewed blogs of the last year weren’t even written in 2016. Four were written in 2015, and three were written in 2014! The other obvious trend is the interest in the number of satellites in space, which can be seen by the titles of six of the ten most widely read blogs:

We’ve also found these blogs quoted by a variety of other web pages, and the occasional report. It’s always interesting to see where we’re quoted!

The other most read blogs of the year were:

Whilst only three of 2016’s blogs made our top ten, this is partly understandable as they have had less time to attract the interest of readers and Google. However, looking at the most read blogs written in 2016 shows an interest in the growth of the Earth observation market, Brexit, different types of data and Playboy!

We’ve now completed three years of weekly blogs, and the views on our website have grown steadily. This year has seen a significant increase in viewed pages, which is something we’re delighted to see.

We like our blog to be of interest to our colleagues in remote sensing and Earth observation, although we also touch on issues of interest to the wider space, and small business, communities.

At Pixalytics we believe strongly in education and training in both science and remote sensing, together with supporting early career scientists. As such we have a number of students and scientists working with us during the year, and we always like them to write a blog. Something they’re not always keen on at the start! This year we’ve had pieces on:

Writing a blog each week can be hard work, as Wednesday mornings always seem to come around very quickly. However, we think this work adds value to our business and makes a small contribution to explaining the industry in which we work.

Thanks for reading this year, and we hope we can catch your interest again next year.

We’d like to wish everyone a Happy New Year, and a very successful 2017!

Rio Olympics from space

Rio de Janeiro, Brazil, acquired on the 13th July 2016. Image courtesy of Copernicus/ESA.

The Opening Ceremony of the 2016 Summer Olympics takes place on Friday and so we’ve decided to revive our highly infrequent blog series ‘Can you see sporting venues from space?’ Previously we’ve looked for the Singapore and Abu Dhabi Formula One Grand Prix Circuits, but this week we’re focussing on the Rio Olympic venues.

Rio de Janeiro
The Games of the XXXI Olympiad will take place from the 5th to the 21st August in the Brazilian city of Rio de Janeiro. It is expected that more than ten thousand athletes will compete for the 306 Olympic titles across 37 venues; 7 of these are temporary and 5 are outside Rio. The remaining 25 are permanent venues within the city, 11 of which have been newly built for the Olympics and Paralympics. It is these permanent venues that we’ll see if we can spot from space!

The image at the top of the blog shows the Rio area, and you’ll notice the dark green area in the centre of the image, which is the Tijuca National Park containing one of the world’s largest urban rainforests. It covers an area of 32 km².

Spatial Resolution
Spatial resolution is the key characteristic in whether sporting venues can be seen from space, and in simplistic terms it refers to the smallest object that can be seen on Earth by that sensor. For example, an instrument with a 10 m spatial resolution means that each pixel in its image represents a 10 m × 10 m area on the ground, and therefore for something to be distinguishable in that image it needs to be larger than 10 m across. There are exceptions to this rule, such as gas flares, which are so bright that they can dominate a much larger pixel.

We used the phrase ‘simplistic terms’ above because technically, the sensor in the satellite doesn’t actually see a square pixel, instead it sees an ellipse due to the angle through which it receives the signal. The ellipses are turned into square pixels by data processing to create the image. Spatial resolution is generally considered to have four categories:

  • Low spatial resolution: pixels between 50 m and 1 km.
  • Medium spatial resolution: pixels between 4 m and 50 m.
  • High spatial resolution: pixels between 1 m and 4 m.
  • Very high spatial resolution: pixels between 0.25 m and 1 m.
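As a rough sketch, these bands can be expressed in code. The boundaries are the ones listed above; the function name, and the choice to put each boundary value in the finer category, are our own:

```python
def resolution_category(pixel_size_m):
    """Classify spatial resolution from pixel size in metres,
    using the bands listed above (boundary values are placed in
    the finer category, an arbitrary choice of ours)."""
    if pixel_size_m < 0.25:
        return "finer than very high"
    if pixel_size_m <= 1:
        return "very high"
    if pixel_size_m <= 4:
        return "high"
    if pixel_size_m <= 50:
        return "medium"
    if pixel_size_m <= 1000:
        return "low"
    return "coarser than low"

print(resolution_category(10))   # Sentinel-2 visible/NIR bands: medium
print(resolution_category(0.5))  # WorldView-class imagery: very high
```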

Clearly, very high resolution imagery, such as that provided by the commercial WorldView satellites owned by DigitalGlobe, can provide great images of the Olympic venues. However, as you know, we like to work with data that is free to access, rather than paid for. We’ve used Sentinel-2 data for this blog, which has a 10 m spatial resolution for the visible and near-infrared bands of the multispectral imager it carries.

Can we see the Olympic venues from space?
In the earlier parts of this infrequent series we couldn’t see the night race at the Singapore circuit, but we did identify the Abu Dhabi track and the red roof of the Ferrari World theme park. So can we see the Olympics? Actually, we can!

Image courtesy of Copernicus/ESA.

On the image to the left, you’ll notice two bright white circles: one in the middle of the image and a second to the south-east. The bright circle in the middle is the Olympic Stadium, which will host the athletics and stands out clearly from the buildings surrounding it. To the south-east is the Maracanã Stadium, which will stage the opening and closing ceremonies together with the finals of the football tournaments.

Image courtesy of Copernicus/ESA.

In the bottom left of the image is a small triangular shape, which is the location of the Aquatics Stadium, the Olympic Tennis Centre, the gymnastics and wheelchair basketball arenas, and the Carioca arenas, which will host basketball, judo, wrestling and boccia. The bottom of the triangle juts out into the Jacarepaguá Lagoon.

Image courtesy of Copernicus/ESA.

In the top left of the image, you can see the runway of the military Afonsos Air Force Base. North of the air base are a number of other Olympic venues, although these are hard to spot within their surroundings; they include the Equestrian Centre, Hockey Centre, BMX Centre, the whitewater canoe slalom course and the Deodoro Stadium, which will host the rugby sevens and modern pentathlon.

It is possible to see the Olympic venues from space! Good luck to all the athletes competing over the next few weeks.

The cost of ‘free data’

False Colour Composite of the Black Rock Desert, Nevada, USA. Image acquired on 6th April 2016. Data courtesy of NASA/JPL-Caltech, from the Aster Volcano Archive (AVA).

Last week, the US and Japan announced free public access to the archive of nearly 3 million images taken by the ASTER instrument; previously this data had only been accessible for a nominal fee.

ASTER, the Advanced Spaceborne Thermal Emission and Reflection Radiometer, is a joint Japan-US instrument aboard NASA’s Terra satellite, with the data used to create detailed maps of land surface temperature, reflectance and elevation. When NASA made the Landsat archive freely available in 2008, an explosion in usage occurred. Will the same happen to ASTER?

As a remote sensing advocate I want many more people to be using satellite data, and I support any initiative that contributes to this goal. Public satellite data archives such as Landsat, are often referred to as ‘free data’. This phrase is unhelpful, and I prefer the term ‘free to access’. This is because ‘free data’ isn’t free, as someone has already paid to get the satellites into orbit, download the data from the instruments and then provide the websites for making this data available. So, who has paid for it? To be honest, it’s you and me!

To be accurate, these missions are generally funded by the taxpayers of the country that put the satellite up. For example:

  • ASTER was funded by the American and Japanese public.
  • Landsat is funded by the American public.
  • The Sentinel satellites, under the Copernicus missions, are funded by the European public.

In addition to making basic data available, missions often also create a series of products derived from the raw data. This is achieved either by commercial companies being paid grants to create these products, which can then be offered as free-to-access datasets, or by the companies developing the products themselves and then charging users for access to them.

‘Free data’ also creates user expectations, which may be unrealistic. Whenever a potential client comes to us, there is always a discussion on which data source to use. Pixalytics is a data independent company, and we suggest the best data to suit the client’s needs. However, this isn’t always the free to access datasets! There are a number of physical and operating criteria that need to be considered:

  • Spectral wavebands / frequency bands: wavelengths for optical instruments and frequencies for radar instruments, which determine what can be detected.
  • Spatial resolution: the size of the smallest objects that can be ‘seen’.
  • Revisit times: how often you’re likely to get a new image; important if you’re interested in several acquisitions close together.
  • Long-term archives of data: very useful if you want to look back in time.
  • Availability: for example, delivery schedule and ordering requirements.

We don’t want any client to pay for something they don’t need, but sometimes commercial data is the best solution. As the cost of this data can range from a few hundred to thousand pounds, this can be a challenging conversation with all the promotion of ‘free data’.

So, what’s the summary here?

If you’re analysing large amounts of data, e.g. for a time-series or large geographical areas, then free to access public data is a good choice as buying hundreds of images would often get very expensive and the higher spatial resolution isn’t always needed. However, if you want a specific acquisition over a specific location at high spatial resolution then the commercial missions come into their own.

Just remember, no satellite data is truly free!

Temporal: The forgotten resolution

Time, Copyright: scanrail / 123RF Stock Photo

Temporal resolution shouldn’t be forgotten when considering satellite imagery; however it’s often neglected, with its partners of spatial and spectral resolution getting the limelight. The reason is the special relationship spatial and spectral has, where a higher spectral resolution has meant a lower spatial resolution and vice-versa, because of limited satellite disk space and transmission capabilities. Therefore, when considering imagery most people focus on their spatial or spectral needs and go with whatever best suits their needs, rarely giving temporal resolution a second thought, other than if immediate data acquisition is required.

Temporal resolution is the amount of time it takes a satellite to return and collect data for exactly the same location on Earth, also known as the revisit or repeat time, expressed in hours or days. Global-coverage satellites tend to have low Earth polar, or near-polar, orbits, travelling at around 27,000 km/h and taking around 100 minutes to circle the Earth. With each orbit the Earth rotates around twenty-five degrees about its polar axis, so on each successive orbit the ground track moves to the west, meaning it takes days to weeks to build up full coverage; for example, Landsat has a 16-day absolute revisit time.
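The twenty-five degrees quoted above follows from simple arithmetic, which can be checked in a couple of lines (both figures are the approximate values from the paragraph above):

```python
# Rough check of the orbital arithmetic above (approximate figures).
MINUTES_PER_DAY = 24 * 60    # ~1440 minutes; close enough for this estimate
orbit_period_min = 100       # typical low Earth orbit period quoted above

# Degrees the Earth rotates beneath the satellite during one orbit
rotation_per_orbit_deg = 360 * orbit_period_min / MINUTES_PER_DAY
print(rotation_per_orbit_deg)  # 25.0
```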

Only seeing the part of the Earth you want to image once every few weeks isn’t very helpful if you want to see daily changes. Therefore, there are a number of techniques satellites use to improve temporal resolution:

  • Swath width – A swath is the area of ground the satellite sees on each orbit; the wider the swath, the greater the ground coverage, although generally a wider swath means a lower spatial resolution. A satellite with a wide swath will have significant overlaps between orbits, allowing areas of the Earth to be imaged more frequently and reducing the revisit time. MODIS uses a wide swath and images the globe every one to two days.
  • Constellations – If you have two identical satellites orbiting one hundred and eighty degrees apart you reduce revisit times; this approach is being used by ESA’s Sentinel missions. Sentinel-1A was launched in 2014, with its twin, Sentinel-1B, due to be launched in 2016. When operating together they will provide a temporal resolution of six days. Obviously, adding more satellites to a constellation will reduce the revisit time further.
  • Pointing – High-resolution satellites in particular use this method, which allows a satellite to point its sensors at a particular point on Earth and so map the same area from multiple orbits. However, pointing changes the angle at which the sensor views the Earth, which means the ground area it observes can be distorted.
  • Geostationary orbits – Although technically not a revisit in the same sense, a geostationary satellite remains focussed on one area of the Earth at all times, so the temporal resolution is simply how often imagery is taken, for example every fifteen minutes. The problem is that you can only map a restricted area.
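The constellation point can be made concrete with a minimal sketch: with n identical, evenly phased satellites, the revisit time simply divides by n (the 12-day single-satellite figure below is Sentinel-1’s published repeat cycle, halving to the six days mentioned above):

```python
def constellation_revisit(single_satellite_days, n_satellites):
    """Effective revisit time for n identical, evenly phased satellites."""
    return single_satellite_days / n_satellites

# Sentinel-1: one satellite repeats every 12 days; the A/B pair halves that.
print(constellation_revisit(12, 2))  # 6.0 days
```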

Hopefully this has given you a little insight into temporal resolution; whilst spectral and spatial resolution are important factors when considering what imagery you need, do spend a bit of time considering your temporal needs too!

SMAP ready to map!

Artist’s rendering of the Soil Moisture Active Passive satellite.
Image credit: NASA/JPL-Caltech

On the 31st January NASA launched its Soil Moisture Active Passive satellite, generally known by the more pronounceable acronym SMAP, aboard a Delta II rocket. It will go into a near-polar sun-synchronous orbit at an altitude of 685 km.

The SMAP mission will measure the amount of water in the top five centimetres of soil, and whether the ground is frozen or not. These two measurements will be combined to produce global maps of soil moisture to improve understanding of the water, carbon and energy cycles. This data will support applications including weather forecasting, drought monitoring, flood prediction and crop productivity, as well as providing valuable information to climate science.

The satellite carries two instruments: a passive L-band radiometer and an active L-band synthetic aperture radar (SAR). Once in space the satellite will deploy a spinning 6 m gold-coated mesh antenna, which will measure the backscatter of radar pulses, and the naturally occurring microwave emissions, from the Earth’s surface. Rotating 14.6 times every minute, the antenna will trace overlapping loops of 1,000 km, giving a wide measurement swath. This means that whilst the satellite itself has an eight-day repeat cycle, SMAP will take global measurements every two to three days.

Interestingly, although deployable antennas have previously been used on large communication satellites, this will be the first time one has been used for scientific measurement, and the first time one has been spun.

The radiometer has a high soil moisture measurement accuracy, but a spatial resolution of only 40 km; whereas the SAR instrument has a much higher spatial resolution of 10 km, but lower soil moisture measurement sensitivity. Combining the passive and active observations will give measurements of soil moisture at 10 km, and freeze/thaw ground state at 3 km. Whilst SMAP is focussed on mapping Earth’s non-water surface, it’s also anticipated to provide valuable data on ocean salinity.

SMAP will provide data about soil moisture content across the world, the variability of which is not currently well understood, yet is vital to understanding both the water and carbon cycles that impact our weather and climate.

Why counting animals from space isn’t as hard as you think

Great Migration in Maasai Mara National Park, Kenya; copyright alextara / 123RF Stock Photo

Last week the keepers at London Zoo were busy counting their 17,000 animals as part of the annual headcount. Knowing numbers is vital in the wild too, but counting animals on the plains of Africa is more challenging. Traditionally, wild counts are either ground surveys, which take people and time, or aerial surveys, which can spook the animals. Satellite remote sensing could offer a potential solution, but it’s not straightforward. Three papers published in 2014 show the possibilities, and challenges, of using satellites to count animals.

The paper Spotting East African Mammals in Open Savannah from Space by Zheng Yang et al. (2014), published on the 31st December, describes the use of very high-resolution GeoEye-1 satellite images to detect large animals in the Maasai Mara National Reserve, Kenya. GeoEye-1’s 2 m multispectral image resolution was not sufficient to detect large animals. However, when combined with the panchromatic image using a pan-sharpening technique, the resolution improved to 0.5 m, meaning adult wildebeest and zebras were 3 to 4 pixels long and 1 to 2 pixels wide. Experienced Kenyan wildlife researchers initially reviewed the images visually to develop a classification system, forming the basis of a hybrid image analysis using both pixel-based and object-based approaches to determine which pixels belonged to animals. The results showed an average count error of 8.2% compared to manual counts, with an omission error rate of 6.6%, which demonstrates that satellites have potential for use in counting; it’s cheaper and less intrusive than existing methods.
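Pan-sharpening fuses a coarse multispectral image with a fine panchromatic one. The paper doesn’t specify which algorithm was used, so as an illustration here is the common Brovey ratio method on synthetic data; the array sizes mirror GeoEye-1’s 2 m multispectral and 0.5 m panchromatic grids, and everything else below is our own sketch:

```python
import numpy as np

def brovey_pansharpen(ms, pan):
    """Brovey-ratio pan-sharpening: upsample the multispectral image to the
    panchromatic grid, then rescale each band by pan / intensity."""
    factor = pan.shape[0] // ms.shape[0]           # e.g. 2 m -> 0.5 m is 4x
    ms_up = ms.repeat(factor, axis=0).repeat(factor, axis=1)
    intensity = ms_up.mean(axis=2)                 # simple band average
    return ms_up * (pan / (intensity + 1e-6))[..., None]

rng = np.random.default_rng(0)
ms = rng.random((64, 64, 3))      # synthetic 2 m multispectral tile
pan = rng.random((256, 256))      # synthetic 0.5 m panchromatic tile
sharp = brovey_pansharpen(ms, pan)
print(sharp.shape)                # the 0.5 m grid, with all three bands
```

Real workflows would add georegistration and radiometric matching first; this only shows the core ratio step.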

The second paper, published by Seth Stapleton et al. (2014) and entitled Assessing Satellite Imagery as a Tool to Track Arctic Wildlife, used 0.5 m resolution imagery of Rowley Island in Foxe Basin, Canada, from WorldView-2 to monitor the island’s polar bear population. The images were corrected for terrain and solar irradiance, and a histogram stretch was applied to brighten darker, non-ice areas to help human analysts identify the bears. Two observers visually identified ‘presumed bears’, both individually and jointly, resulting in the identification of 92 presumed bears. This satellite-derived figure was consistent with other models, again offering a potentially cheaper and safer way of monitoring polar bears.

Finally, Peter Fretwell et al. (2014) published Counting Southern Right Whales by Satellite. Also using WorldView-2, they analysed a 2 m resolution image with eight colour bands and one panchromatic band. The images were processed using ENVI 5 and ArcGIS to identify potential and probable whales, and visual inspection then showed they had identified objects of the right shape and size to be whales, resulting in 55 probable whales and 23 possible whales. This again shows that satellite images could be useful in estimating whale populations faster and more efficiently.

All three of these papers demonstrate that satellite remote sensing has potential to assist in the monitoring of animal species across the globe. However, there are also significant challenges still to overcome, for example:

  • Resolution: Currently available resolutions may not be sufficient to distinguish the level of detail conservationists need, such as species identification in Africa or polar bear cubs in Canada. However, it may become possible with very high resolution satellites such as the planned WorldView-4 from DigitalGlobe.
  • Cloud cover: The persistent nemesis of optical Earth observation imagery may hamper its use in certain areas or seasons.
  • Complicated environments: Further research is needed to ensure animals can be accurately distinguished from their surroundings.

Despite these reservations, the potential to offer regular, more efficient and safer methods of surveying animal populations from space means this will be a rapidly developing area of Earth observation.

The Small and Mighty Proba Missions

This week the European Space Agency announced the latest mission in the Project for OnBoard Autonomy (Proba) mini-satellite programme. Proba-3 is planned to launch in four years, and will be a pair of satellites flying in close formation, 150 m apart, with the front satellite creating an artificial eclipse of the Sun, allowing its companion views of the solar corona, normally only visible momentarily during solar eclipses.

Tamar estuary captured in October 2005, data courtesy of ESA.

The Proba missions are part of ESA’s In-orbit Technology Demonstration Programme, which focuses on testing, and using, innovative technologies in space. Despite Proba-3’s nomenclature, it will be the fourth mission in the Proba programme. The first, Proba-1, was launched on the 22nd October 2001 on a planned two-year Earth observation (EO) mission; however, thirteen years later it is still flying and sending back EO data. It’s in a sun-synchronous orbit with a seven-day repeat cycle and carries eight instruments. The main one is the Compact High Resolution Imaging Spectrometer (CHRIS), developed in the UK by the Space Group of Sira Technology Ltd, which was later acquired by Surrey Satellite Technology Limited. CHRIS is a hyperspectral sensor that acquires a set of up to five images of a target, with different modes allowing the collection of up to 62 spectral wavebands.

Plymouth, where Pixalytics is based, and our lead consultant, Dr Samantha Lavender, have a long history with Proba-1. Rame Head point, along the coast from Plymouth, is one of the test sites for the CHRIS instrument, and she’s been doing research using the data it provides for over a decade. Over Plymouth, Mode 2 is used, which focuses on mapping the water at a spatial resolution of 17 m; this mode was proposed by Sam back in the early days of CHRIS-Proba. The image at the top of the page, captured in October 2005, shows the Tamar estuary in the UK, which separates the counties of Devon and Cornwall; for this image CHRIS was pointed further north due to planned fieldwork activities. At the bottom of the image is the thick line of the Tamar Road Bridge and below it, the thinner Brunel railway bridge. Plymouth is to the right of the bridges, and to the left is the Cornish town of Saltash.

Proba-V image of the Nile Delta in Egypt, courtesy of the Belgian PROBA-V / ESA Earth Watch programmes

Proba-2 was launched in 2009, carrying two solar observation experiments, two space weather experiments and seventeen other technology demonstrations. ESA returned to EO for the third mission, Proba-V, launched on the 7th May 2013; the change in nomenclature is because the V stands for Vegetation. It is a redesign of the ‘Vegetation’ imaging instrument carried on the French SPOT satellites; it has a 350 m ground resolution with a 2,250 km swath, and collects data in blue, red, near-infrared and mid-infrared wavebands. It provides worldwide coverage every two days, and through its four spectral bands it can distinguish between different types of land cover. The image on the right is from Proba-V, showing the Nile Delta on the 2nd May 2014.

Despite their small stature all the Proba satellites are showing their resilience by remaining operational, and they’re playing a vital role in allowing innovative new technologies to be tested in space.

Can Earth Observation answer your question?

The opportunities and challenges of utilising Earth observation (EO) data played out in microcosm in our house over the weekend. On Sunday afternoon, I was watching highlights of the Formula One Singapore Grand Prix which takes place on the harbour streets of Marina Bay and is the only night race of the season. To ensure the drivers can see, there are over 1,500 light projectors installed around the circuit giving an illumination of around 3,000 lux.

Whilst watching, I wondered aloud whether we’d be able to see the track from space with the additional floodlights. My idle wondering caught Sam’s interest far more than the actual race, and she decided to see if she could answer the question. The entire circuit is just over five kilometres long, but it’s a loop, giving a footprint of roughly two kilometres across; any imagery would need a spatial resolution much smaller than this. The final difficulty was that the data needed to be from this weekend, as the circuit is only floodlit for the racing.

Within a few laps Sam had identified free near-real-time night data available from the United States National Oceanic & Atmospheric Administration (NOAA) which covered the required area and timeframe. This was from the Visible Infrared Imaging Radiometer Suite (VIIRS), using its Day/Night Band with a 750 m spatial resolution. This resolution meant we would not be able to see the outline of the track, as it would be represented by only three or four pixels, but it would be interesting to see if we could identify the track as a feature. By the end of the race Sam had selected and downloaded the data, and so we could answer my question. However, it turned out to be not quite that easy.

VIIRS Singapore night time imagery, data courtesy of NOAA

NOAA data uses a slightly different format to the image processing packages we had, and we couldn’t initially see what we’d downloaded. Sam had to write some computer code to modify the packages to read the NOAA data. For anyone thinking this is an odd way to spend a Sunday evening: to Sam this was a puzzle to solve, and she was enjoying herself! After some rapid coding we were able to view the image, but unfortunately the Saturday data wasn’t useful. On Monday we tried again; the Sunday race took place on a clear night and we got a good image of the area, which you can see above. On the larger image you can clearly see the Indonesian islands, with Jakarta shining brightly, up through the Java Sea where the lights of some ships are visible, and then at the top of the image is Singapore; the zoomed-in version of Singapore is the inset image.
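We don’t have Sam’s actual code, and the real VIIRS product layout is more involved, but as a hypothetical sketch of the kind of fix needed: science data is often stored big-endian, and an image package expecting native byte order will show garbage until the values are converted. With numpy (the grid size and data type below are invented for illustration, not the real product layout):

```python
import numpy as np

# Invented dimensions standing in for one swath granule; the real
# VIIRS Day/Night Band layout differs.
rows, cols = 768, 3200

# Simulate a big-endian float32 file as raw bytes.
raw = np.arange(rows * cols, dtype=">f4").tobytes()

# Read it back honouring the byte order, then convert to the native
# float32 that standard image tools expect.
radiance = np.frombuffer(raw, dtype=">f4").reshape(rows, cols)
radiance = radiance.astype(np.float32)
print(radiance.shape)  # one 768 x 3200 swath grid
```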

Despite the floodlights used for the race, Singapore and some of the surrounding Malaysian cities are so bright at night that the additional lights simply contribute to the overall illumination, rather than making the track stand out. Hence the answer to my question is that the 2014 floodlit Singapore F1 street circuit can’t be distinguished from the surrounding area at this spatial resolution. Of course, if we purchased high resolution imagery we might be able to see more detail, but we thought that was going a bit far for my idle wondering!

EO can answer questions like these quickly. We know not many businesses depend on whether the Singapore Grand Prix can be seen from space, but change the question to the light pollution in your area, deforestation in the middle of the jungle, what phytoplankton are doing in the middle of the ocean, or whatever question you might have, and EO might be able to provide the answer in a short space of time.

However, there are two main difficulties in getting the answer: firstly, you’ve got to know where to find the data, and secondly, what to do with it when you get it. Currently this can be challenging without specialist knowledge, making it inaccessible to the general population. In the coming weeks, we’re going to write some blogs looking at the freely available EO data and the easiest ways of viewing it. Hopefully, this may help you answer your own questions. In the meantime, if you have questions you want answered, get in touch; we’d be happy to help.

Controlling the Space Industry Narrative

The narrative of the satellite industry over the last week had all the components of a blockbuster novel or film: with new adventures beginning, dramatic challenges to overcome, redemption and an emotional end.

Artist’s rendition of a satellite – paulfleet/123RF Stock Photo

Like lots of good stories, we start with characters setting off on new adventures. Firstly, China launched its most powerful imaging satellite, Gaofen-2. It carries a high-resolution optical imager capable of providing images with a spatial resolution of 80 cm in panchromatic mode and 3.2 m in multispectral mode, and has a swath width of 48 km. It is the second in a series of seven Earth observation (EO) satellites, following Gaofen-1 launched in April 2013, which will provide environmental monitoring, disaster management support, urban planning and geographical mapping. The Long March 4B rocket launched Gaofen-2, redeeming itself following a failure last December that caused the loss of the CBERS-3 EO satellite. The second significant launch was from the International Space Station on the 19th August, when the first pair of the twenty-eight-satellite Flock 1B constellation were deployed, with further pairs sent on the 20th, 21st and 23rd. Flock 1B is part of a series of Earth-imaging nanosat constellations from Planet Labs, providing images with a spatial resolution of between 3 and 5 m.

ESA’s Galileo satellites, Doresa and Milena, provided the drama by failing to reach their planned altitude of 29.9km, reaching an orbit of 26.9km; in addition, their inclination angle is 49.8 degrees to the equator, rather than 55 degrees. They were the fifth and sixth satellites in Europe’s version of the American GPS satellite navigation system, launched on the Soyuz rocket. Getting the satellites to the correct position is likely to require more fuel than they carry. Like Long March 4B, Soyuz will get its chance of redemption in December with the launch of the next two Galileo satellites.

The Tropical Rainfall Measuring Mission (TRMM), a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA), provides the emotional end to our story with the announcement last week that it has run out of fuel. Launched in 1997, TRMM had a three-year life expectancy, but will now provide an incredible nineteen years’ worth of data. It will continue collecting until early 2016, when its instruments will be turned off in preparation for re-entry.

It’s interesting to see how this news has been reported in the mainstream media, little mention of China’s progress, or the second Flock constellation or the amazing longevity of TRMM; instead, the focus was the failure of the Galileo satellites. There is rarely widespread coverage of the successful launches of satellites, but there is a push within the UK for the community to celebrate our successes more so the full range of space activities can be seen.

Earth observation is all about data and images, and whilst these may interest people, it’s only through the power of storytelling that we can describe the positives of the industry, motivating and inspiring people. Remember to create stories for your industry, and your company, or someone else will dictate the narrative.

Why understanding spatial resolution is important

Spatial resolution is a key characteristic in remote sensing, where it’s often used to refer to the size of pixels within an acquired image. However, this is a simplification, as the detector in the satellite doesn’t see the square suggested by a pixel, but rather an ellipse, due to the angle through which the detector receives the signal – known as the instantaneous field of view. The ellipses are turned into square pixels by data processing when creating the image.

The area of the port of Rotterdam shown using a Landsat image (background) at 30m resolution and MERIS full resolution image (inset image) at 300m resolution; data courtesy of the USGS and ESA. Example used within Hydrographic Academy eLearning material.

Therefore, for example, when viewing an image with 1 km resolution, not only will you be unable to see anything smaller than 1 km in size, but objects need to be significantly larger than 1 km for any detail to be discernible. Whilst this might be fine if you’re looking at changes in temperature across the Atlantic Ocean, it won’t be much use if you’re interested in suspended sediment blooms at the mouth of a small river.

Any image with a spatial resolution of between 50 m and 1 km is described as having low spatial resolution. For example, MODIS operates at low spatial resolutions ranging from 250 m to 1,000 m, as its primary focus is global mapping rather than capturing detailed imagery of local regions.

If you want to look for smaller objects, you’ll need to use images with medium spatial resolutions of between 4 m and 50 m. There is quite a lot of freely available imagery within this range; for example, NASA’s Landsat 8 operates at 15 m, 30 m and 100 m resolution, and ESA’s Sentinel-1A operates at the three resolutions of 5 m, 20 m and 100 m. If you want to go even finer, you will require high spatial resolution images, which go down to resolutions of between 4 m and 1 m, or very high spatial resolution images, which cover the 0.5 m – 1 m range. Commercial organisations tend to operate satellites with these higher levels of resolution, and they charge for making the images available. It’s likely that military satellites offer imagery down to 0.15 m, but there are regulations in place to prevent the sale of extremely high resolution imagery, as it’s considered a potential danger to security.
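The trade-off can be illustrated with a quick calculation of how many pixels an object spans at each of the resolutions mentioned above (the 100 m object is an arbitrary example of ours):

```python
def pixels_across(object_size_m, resolution_m):
    """Number of pixels an object of the given size spans in one dimension."""
    return object_size_m / resolution_m

# A hypothetical 100 m object at resolutions discussed above:
for res_m in (1000, 100, 30, 10, 1, 0.5):
    print(f"{res_m} m resolution -> {pixels_across(100, res_m):g} pixels")
```

At 1 km resolution the object is a fraction of a pixel and invisible; at 0.5 m it spans hundreds of pixels and shows real detail, which is what you pay the commercial operators for.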

Spatial resolution was in the headlines last week with the launch of DigitalGlobe’s WorldView-3 satellite, which can produce spectral images with a resolution down to 0.31 m. Technologies to produce images at this resolution have been around for some time, but as reported by Reuters in June, DigitalGlobe has only recently received a licence from the US Commerce Department to start selling images with a resolution of up to 0.25 m; without this licence it wouldn’t be able to sell this higher resolution imagery.

Regulatory involvement in very high resolution imagery was also demonstrated earlier this year when, in January, the UK government blocked the European Commission’s effort to set common European regulations on the sale of high-resolution satellite imagery. The UK government currently controls access to data through export licensing conditions on the satellite hardware, and it felt regulations would impact the UK’s ability to export space technology.

Therefore, spatial resolution is an important term, and one every remote sensing client should understand. Different services require different spatial resolutions, and selecting the most appropriate resolution for your needs will not only ensure that you get exactly what you want, but could also save you money as you don’t want to over-specify.