Blue Phase at Wavelength 2018

Blue John Cavern

Last week I attended the 2018 Wavelength Conference in Sheffield. This is the annual gathering of the Remote Sensing and Photogrammetry Society (RSPSoc) and is geared towards PhD students and early career scientists. The conference's aim is to provide a welcoming and constructive atmosphere for presenting research and progress towards PhDs, coupled with a vibrant social programme.

This was my first experience of a remote sensing conference, and the cosy common room where it was held, together with the absence of the pressure of a larger event, suited that ambition well.

The topics covered varied greatly, each with a focus on how to apply remote sensing and photogrammetry techniques in novel ways to better understand the world around us. They ranged from tracking whales to monitoring rice fields and developing systems to track small-scale landslides.

One technology that was popular across the presentations was machine learning: the training of an artificial intelligence (AI) to classify images for a variety of purposes. Given that it is something I'm becoming involved in at Pixalytics, every mention of AI attracted my attention. One presentation that stuck out for me applied it to tracking the effects of crude oil pollution in the Niger Delta region. Harnessing remote sensing data and using machine learning to sift through hundreds or even thousands of images, classifying details and picking out objects of interest to monitor environmental damage, is a novel approach. It provides a direct link from the science to a serious real-world issue. Whilst this was a localised case, the techniques demonstrated have the potential to better inform our responses to such issues, which in turn will help the people affected by these disasters.
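As a flavour of the kind of supervised classification involved, here is a minimal sketch using a nearest-centroid classifier on synthetic spectra. Everything here (the band values, the two classes, the centroid approach) is an invented stand-in for illustration, not the method used in the presentation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training pixels with 4 spectral bands each.
clean = rng.normal(0.3, 0.05, size=(200, 4))   # class 0: unaffected surface
oiled = rng.normal(0.7, 0.05, size=(200, 4))   # class 1: oil-affected surface

# "Training" here is just storing the mean spectrum (centroid) of each class.
centroids = np.stack([clean.mean(axis=0), oiled.mean(axis=0)])

def classify(pixels):
    """Assign each pixel to the class with the nearest centroid."""
    dists = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Classify a new 100-pixel "scene" built half from each class.
scene = np.vstack([rng.normal(0.3, 0.05, size=(50, 4)),
                   rng.normal(0.7, 0.05, size=(50, 4))])
labels = classify(scene)
print(f"{labels.sum()} of {len(labels)} pixels flagged as oil-affected")
```

Real applications use far richer models and genuine labelled imagery, but the workflow (train on labelled spectra, then classify every pixel of a scene) is the same shape.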

This application of science combined with the potential to one day help people resonated with me greatly. It reminded me of the work I am currently doing on the Drought and Flood Mitigation Service project which will aid the lives of Ugandan farmers.

Two keynotes were delivered during the conference: one by Dr. Alistair Graham of Geoger Ltd, and one by the Chairman of RSPSoc, Dr. Richard Armitage. Dr. Graham's keynote was fascinating, as he described his experiences working in a multitude of different environments, from corporates to SMEs in industry to postdoc positions in academia. He explained the nuances of working in each area and the possible paths for career progression open to PhD students and other early career scientists. I fall into the latter category, but the perspective he provided convinced me to keep my options open for the future. At a time when industry and academia are changing rapidly, anything could happen.

Dr. Armitage’s keynote was on responsive remote sensing and his talk focused on how to use the right remote sensing data at the right time and for the right area. For the problems we come across, identifying the correct approach to take with remote sensing data is crucial.

For example, two important factors to consider for any problem are spatial resolution and data type. Some features need 5 m resolution to be visible, whereas for others 30 m resolution can show what is required. You also need to consider which type of data best suits the problem: optical data has its advantages, but infrared can reveal insights that optical data cannot. Having come across these points before the keynote, it served as a good reinforcement of the topic.

Blue John in the rock.

The highlight of the conference for me was the tour around Blue John Cavern. Tucked away in the Peak District, surrounded by stunning views of the hills, the cavern is home to the famous Blue John stone. The tour guide was a miner who had worked in the cavern for 15 years, and his knowledge made every stop even more interesting.

Whilst a lot of walking and climbing was involved, the colourful Blue John that dotted the walls of the cavern, together with the extremely high ceilings carved out by long-gone rivers, made for amazing views. If you don't mind cramped spaces and traversing up and down a large mine, then Blue John Cavern is a fantastic place to go!

For my first conference experience, Wavelength 2018 was a fantastic introduction. The welcoming atmosphere, getting to see the diverse nature of remote sensing and photogrammetry research going on right now, and the insightful keynotes will stick with me for a long time. I highly recommend that any early career scientist or PhD student attend the next incarnation of this conference.

Chris Doyle
Junior Software Developer
Pixalytics Ltd

Have you read the top Pixalytics blogs of 2017?

Word Cloud showing the top 100 words from Pixalytics' 2017 blogs

In our final blog of the year, we’re looking back at our most popular posts of the last twelve months. Have you read them all?

Of the top ten most read blogs, nine were actually written in previous years. These were:

You'll notice that this list is dominated by our annual reviews of the number of satellites, and Earth observation satellites, orbiting the Earth. It often surprises us to see where these blogs are quoted; we've been included in articles on the websites of Time Magazine, Fortune Magazine and the New Statesman, to name a few!

So, despite only being published in November this year, coming in as the fourth most popular blog of the year was, unsurprisingly:

For posts published in 2017, the other nine most popular were:

2017 has been a really successful year for our website. The number of views for the year is up by 75%, whilst the number of unique visitors has increased by 92%!

Whilst it is hard work, we do enjoy writing our weekly blog – although staring at a blank screen on a Wednesday morning without any idea of what we'll publish a few hours later can be daunting!

We're always delighted at meetings and conferences when people come up and say they read the blog. It's nice to know that we're read within our community, as well as making a small contribution to informing and educating people outside the industry.

Thanks for reading this year, and we hope we can catch your eye again next year.

We’d like to wish everyone a Happy New Year, and a very successful 2018!

3 Ways Earth Observation is Tackling Food Security

Artist's rendition of a satellite - paulfleet/123RF Stock Photo


One of the key global challenges is food security. A number of reports issued last week, coinciding with World Food Day on the 16th October, demonstrated how Earth Observation (EO) could play a key part in tackling this.

Climate change is a key threat to food security. The implications were highlighted by a U.S. Geological Survey (USGS) report which described potential changes to the land suitable for rainfed crops. Rainfed farming accounts for approximately 75 percent of global croplands, and it's predicted that these locations will change in the coming years. More farmland will become available in North America, western Asia, eastern Asia and South America, whilst there will be a decline in Europe and the southern Great Plains of the US.

The work undertaken by USGS focussed on the impact of temperature extremes and the associated changes in the seasonality of soil moisture conditions. The author of the study, John Bradford, said: "Our results indicate the interaction of soil moisture and temperature extremes provides a powerful yet simple framework for understanding the conditions that define suitability for rainfed agriculture in drylands." Soil moisture is a product that Pixalytics is currently working on, and it's intriguing to see that this measurement could be used to monitor climate change.

Given that this issue may require farmers to change crops, work by India’s Union Ministry of Agriculture to use remote sensing data to identify areas best suited for growing different crops is interesting. The Coordinated Horticulture Assessment and Management using geoinformatics (CHAMAN) project has used data collected by satellites, including the Cartosat Series and RESOURCESAT-1, to map 185 districts in relation to the best conditions for growing bananas, mangos, citrus fruits, potatoes, onions, tomatoes and chilli peppers.

The results for eight states in the north east of the country will be presented in January, with the remainder a few months later, identifying the best crop for each district. Given that India is already the second largest producer of fruit and vegetables in the world, this is a fascinating strategic development for their agriculture industry.

The third report was the announcement of a project between the University of Queensland and the Chinese Academy of Sciences which hopes to improve the accuracy of crop yield predictions. EO data with an improved spatial, and temporal, resolution is being used alongside biophysical information to try to predict crop yield at a field scale in advance of the harvest. It is hoped that this project will produce an operational product through this holistic approach.

These are some examples of the ways in which EO data is changing how we look at agriculture, and could potentially help provide improved global food security in the future.

Can You See The Great Wall of China From Space?

Area north of Beijing, China, showing the Great Wall of China running through the centre. Image acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

Dating back over two thousand three hundred years, the Great Wall of China winds its way from east to west across the northern part of the country. The current remains were built during the Ming Dynasty and have a length of 8,851.8 km, according to 2009 work by the Chinese State Administration of Cultural Heritage and the National Bureau of Surveying and Mapping. However, if you take into account the different parts of the wall built by other dynasties, its length is almost twenty-two thousand kilometres.

The average height of the wall is between six and seven metres, and its width is between four and five metres. This width would allow five horses, or ten men, to walk side by side. The sheer size of the structure has led people to believe that it could be seen from space. This was first described by William Stukeley in 1754, when he wrote, in reference to Hadrian's Wall, that 'This mighty wall of four score miles in length is only exceeded by the Chinese Wall, which makes a considerable figure upon the terrestrial globe, and may be discerned at the Moon.'

Despite Stukeley's personal opinion not having any scientific basis, it has been repeated many times since, and by the time humans began to go into space it was considered a fact. Unfortunately, astronauts such as Buzz Aldrin, Chris Hadfield and even China's first astronaut, Yang Liwei, have all confirmed that the Great Wall is not visible from space to the naked eye. Even Pixalytics has got a little involved in this debate. Two years ago we wrote a blog saying that we couldn't see the wall in Landsat imagery, as the spatial resolution was not fine enough to distinguish the wall from its surroundings.

Anyone who is familiar with the QI television series on the BBC will know that they occasionally ask the same question in different shows and give different answers when new information comes to light. This time it’s our turn!

Last week Sam was a speaker at the TEDx One Step Beyond event at the National Space Centre in Leicester – you’ll hear more of that in a week or two. However, in exploring some imagery for the event we looked for the Great Wall of China within Sentinel-2 imagery. And guess what? We found it! In the image at the top, the Great Wall can be seen cutting down the centre from the top left.

Screenshot of SNAP showing area north of Beijing, China. Data acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

It was difficult to spot. The first challenge was getting a cloud-free image of northern China – we only found one covering our area of interest north of Beijing! And despite Sentinel-2 having a 10 m spatial resolution for its visible wavelengths, as noted above, the wall is generally narrower than that. This means it is difficult to see the wall itself, but it is possible to see its path in the image. This ability to see very small things from space through their influence on their surroundings is similar to how we are able to spot microscopic phytoplankton blooms. The image on the right is a screenshot from the Sentinel Application Platform (SNAP), showing the original Sentinel-2 image of China on the top left and a zoomed section identifying the wall.

So whilst the Great Wall of China might not be visible from space with the naked eye, it is visible from our artificial eyes in the skies, like Sentinel-2.

Two New Earth Observation Satellites Launched

Artist's rendition of a satellite - paulfleet/123RF Stock Photo


Two new Earth observation satellites were launched last week from Europe's Spaceport in Kourou, French Guiana, although you may only get to see the data from one of them. Venµs and OPTSAT-3000 were put into sun-synchronous orbits by Arianespace's Vega launch vehicle on the 1st August. Both satellites were built by the state-owned Israel Aerospace Industries and carry instruments from Israel's Elbit Systems.

Venµs, or to give its full title of Vegetation and Environment monitoring on a New MicroSatellite, is a joint scientific collaboration between the Israeli Space Agency (ISA) and France’s CNES space agency.

Venµs is focussed on environmental monitoring, including climate, soil and topography. Its aim is to help improve the techniques and accuracy of global models, with a particular emphasis on understanding how environmental and human factors influence plant health. The satellite is equipped with the Venµs Superspectral Camera (VSSC), which uses 12 narrow spectral bands in the Visible and Near Infrared (VNIR) spectrum – ranging from 420 nm to 910 nm in wavelength – to capture 12 simultaneous overlapping high resolution images, which are then combined into a single image. The camera uses a pushbroom collection technique and has a spatial resolution of 5.3 m and a swath width of 27.56 km.

Venµs won't have full global coverage; instead it will monitor 110 areas of interest around the world, including forests, croplands and nature reserves. It has a two-day revisit time, during which it completes 29 orbits of the planet; this means every thirtieth image will be collected over the same place, at the same time and at the same angle. This will provide high resolution imagery more frequently than is currently available from existing EO satellites. The consistency of place, time and angle will help researchers better assess fine-scale changes on the land to improve our understanding of the:

  • state of the soil,
  • vegetation growth,
  • spread of disease or contamination,
  • snow cover and glacial movements, and
  • sediment movement in coastal estuaries.

A specific software algorithm has been developed for the mission to work with the different wavelengths to remove clouds and aerosols from the satellite’s imagery, giving clear images of the planet irrespective of atmospheric conditions.

The second satellite launched was OPTSAT-3000, an Italian-controlled optical surveillance satellite that will operate in conjunction with the COSMO-SkyMed radar satellites, giving Italy's Ministry of Defence an autonomous national Earth observation capability across both optical and radar imagery.

This is a military satellite, so some of the details are difficult to verify. As mentioned earlier, the instrument was made by Elbit Systems, and the camera used usually offers a spatial resolution of around 0.5 m. However, it has been reported that the resolution will be much closer to 0.3 m because the satellite is in a very low Earth orbit of 450 km.

OPTSAT-3000 will collect high resolution imagery of the Earth. It's not clear at this stage whether any of the imagery will be made available for commercial or scientific use or purchase, although it is worth noting that COSMO-SkyMed images are sold.

The launch of two more Earth observation satellites shows that our industry keeps moving forward! We're really interested – and, in OPTSAT-3000's case, hopeful – to see the imagery they produce.

Locusts & Monkeys

Soil moisture data from the SMOS satellite and the MODIS instrument acquired between July and October 2016 were used by isardSAT and CIRAD to create this map showing areas with favourable locust swarming conditions (in red) during the November 2016 outbreak. Data courtesy of ESA. Copyright : CIRAD, SMELLS consortium.

Spatial resolution is a key characteristic in remote sensing, as we've previously discussed. Often the view is that you need an object to be significantly larger than the resolution to be able to see it in an image. However, this is not always the case, as satellites can often identify indicators of objects that are much smaller.

We’ve previously written about satellites identifying phytoplankton in algal blooms, and recently two interesting reports have described how satellites are being used to determine the presence of locusts and monkeys!


Desert locusts are a type of grasshopper; whilst individually they are harmless, as a swarm they can cause huge damage to the populations in their path. Between 2003 and 2005 a swarm in West Africa affected eight million people, with reported losses of 100% for cereals, 90% for legumes and 85% for pasture.

Swarms occur when certain conditions are present, namely a drought followed by rain and vegetation growth. ESA and the UN Food and Agriculture Organization (FAO) have been working together to determine whether data from the Soil Moisture and Ocean Salinity (SMOS) satellite can be used to forecast these conditions. SMOS carries the Microwave Imaging Radiometer using Aperture Synthesis (MIRAS) instrument – a 2D interferometric L-band radiometer with 69 antenna receivers distributed on a Y-shaped deployable antenna array. It observes the 'brightness temperature' of the Earth, which indicates the radiation emitted from the planet's surface. It has a temporal resolution of three days and a spatial resolution of around 50 km.

By combining the SMOS soil moisture observations with data from NASA's MODIS instrument, the team were able to downscale the SMOS data to a 1 km spatial resolution and use it to create maps. This approach predicted favourable locust swarming conditions approximately 70 days ahead of the November 2016 outbreak in Mauritania, showing the potential for an early warning system.
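The downscaling idea can be sketched very simply: relate the coarse soil-moisture field to a correlated fine-scale field, then apply that relationship on the fine grid. The toy below uses a plain linear regression and synthetic grids standing in for SMOS (~50 km) and MODIS-derived (1 km) data; the published method is considerably more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fine 8x8 grid of a MODIS-like index, and the "true" fine soil moisture
# that (in this toy) depends linearly on it.
fine_index = rng.random((8, 8))
true_sm = 0.1 + 0.3 * fine_index

# Coarse SMOS-like observation: average the fine truth over 4x4 blocks.
coarse_sm = true_sm.reshape(2, 4, 2, 4).mean(axis=(1, 3))
coarse_index = fine_index.reshape(2, 4, 2, 4).mean(axis=(1, 3))

# Fit soil moisture ~ index at the coarse scale, then apply at fine scale.
slope, intercept = np.polyfit(coarse_index.ravel(), coarse_sm.ravel(), 1)
downscaled = intercept + slope * fine_index

print(f"max error vs truth: {np.abs(downscaled - true_sm).max():.6f}")
```

Because the toy relationship is exactly linear, the disaggregation is exact here; with real data the fine-scale field only partially explains soil moisture, so the downscaled product carries uncertainty.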

This is interesting for us as we’re currently using soil moisture data in a project to provide an early warning system for droughts and floods.


Earlier this month the paper, ‘Connecting Earth Observation to High-Throughput Biodiversity Data’, was published in the journal Nature Ecology and Evolution. It describes the work of scientists from the Universities of Leicester and East Anglia who have used satellite data to help identify monkey populations that have declined through hunting.

The team have used a variety of technologies and techniques to pull together indicators of monkey distribution, including:

  • Earth observation data to map roads and human settlements.
  • Automated recordings of animal sounds to determine which species are in the area.
  • Analysis of captured mosquitos to determine what they have been feeding on.

Combining these various datasets provides a huge amount of information, and can be used to identify areas where monkey populations are vulnerable.

These projects demonstrate an interesting capability of satellites, which is not always recognised and understood. By using satellites to monitor certain aspects of the planet, the data can be used to infer things happening on a much smaller scale than individual pixels.

Have you read the top Pixalytics blogs of 2016?

Artist's rendition of a satellite - paulfleet/123RF Stock Photo


As this is the final blog of the year, we'd like to take a look back over the past fifty-two weeks and see which blogs captured people's attention – and, conversely, which did not!

It turns out that seven of the ten most widely viewed blogs of the last year weren’t even written in 2016. Four were written in 2015, and three were written in 2014! The other obvious trend is the interest in the number of satellites in space, which can be seen by the titles of six of the ten most widely read blogs:

We’ve also found these blogs quoted by a variety of other web pages, and the occasional report. It’s always interesting to see where we’re quoted!

The other most read blogs of the year were:

Whilst only three of 2016's blogs made our top ten, this is partly understandable as they have had less time to attract the interest of readers and Google. However, looking at the most read blogs of 2016 shows an interest in the growth of the Earth observation market, Brexit, different types of data and Playboy!

We’ve now completed three years of weekly blogs, and the views on our website have grown steadily. This year has seen a significant increase in viewed pages, which is something we’re delighted to see.

We like our blog to be of interest to our colleagues in remote sensing and Earth observation, although we also touch on issues of interest to the wider space and small business communities.

At Pixalytics we believe strongly in education and training in both science and remote sensing, together with supporting early career scientists. As such we have a number of students and scientists working with us during the year, and we always like them to write a blog. Something they’re not always keen on at the start! This year we’ve had pieces on:

Writing a blog each week can be hard work, as Wednesday mornings always seem to come around very quickly. However, we think this work adds value to our business and makes a small contribution to explaining the industry in which we work.

Thanks for reading this year, and we hope we can catch your interest again next year.

We’d like to wish everyone a Happy New Year, and a very successful 2017!

Rio Olympics from space

Rio de Janeiro, Brazil, acquired on the 13th July 2016. Image courtesy of Copernicus/ESA.


The Opening Ceremony of the 2016 Summer Olympics takes place on Friday and so we’ve decided to revive our highly infrequent blog series ‘Can you see sporting venues from space?’ Previously we’ve looked for the Singapore and Abu Dhabi Formula One Grand Prix Circuits, but this week we’re focussing on the Rio Olympic venues.

Rio de Janeiro
The Games of the XXXI Olympiad will take place from the 5th to the 21st August in the Brazilian city of Rio de Janeiro. It is expected that more than ten thousand athletes will be competing for the 306 Olympic titles across 37 venues, of which 7 are temporary and 5 are outside Rio. The remaining 25 are permanent venues within the city, 11 of which have been newly built for the Olympics and Paralympics. It is these permanent venues that we'll try to spot from space!

The image at the top of the blog shows the Rio area, and you'll notice the dark green area in the centre of the image, which is the Tijuca National Park, containing one of the world's largest urban rainforests. It covers an area of 32 km².

Spatial Resolution
Spatial resolution is the key characteristic determining whether sporting venues can be seen from space; in simplistic terms, it refers to the smallest object that a sensor can see on Earth. For example, an instrument with a 10 m spatial resolution means that each pixel in its image represents 10 m on the ground, and therefore for something to be distinguishable in that image it needs to be larger than 10 m in size. There are exceptions to this rule, such as gas flares, which are so bright that they can dominate a much larger pixel.

We used the phrase 'simplistic terms' above because, technically, the sensor in the satellite doesn't actually see a square pixel; instead it sees an ellipse, due to the angle through which it receives the signal. The ellipses are turned into square pixels by data processing to create the image. Spatial resolution is generally considered to have four categories:

  • Low spatial resolution: tend to have pixels between 50 m and 1 km.
  • Medium spatial resolution: tend to have pixels between 4 m and 50 m.
  • High spatial resolution: tend to have pixels between 1 m and 4 m.
  • Very high spatial resolution: tend to have pixels between 0.25 m and 1 m.

Clearly, very high resolution imagery, such as that provided by the commercial WorldView satellites owned by DigitalGlobe, can provide great images of the Olympic venues. However, as you know, we like to work with data that is free to access rather than paid for. We've used Sentinel-2 data for this blog, which has a 10 m spatial resolution for the visible and near infrared bands of the multispectral imager it carries.

Can we see the Olympic venues from space?
In earlier parts of this infrequent series we couldn't see the night race at the Singapore circuit, but we did identify the Abu Dhabi track and the red roof of the Ferrari World theme park. So can we see the Olympics? Actually, we can!

Image courtesy of Copernicus/ESA.


In the image to the left, you'll notice two bright white circles, one in the middle of the image and the second to the south-east. The circle in the middle is the Olympic Stadium, which will host the athletics and stands out clearly from the buildings surrounding it. To the south-east is the Maracanã Stadium, which will stage the opening and closing ceremonies together with the finals of the football tournaments.

Image courtesy of Copernicus/ESA.


In the bottom left of the image is a small triangular shape, which is the location of the Aquatics Stadium, the Olympic Tennis Centre, the gymnastics and wheelchair basketball arenas, and the Carioca arenas, which will host basketball, judo, wrestling and boccia. The bottom of the triangle juts out into the Jacarepaguá Lagoon.

Image courtesy of Copernicus/ESA.


In the top left of the image, you can see the runway of the military Afonsos Air Force Base. North of the air base are a number of other Olympic venues, although these are hard to spot within their surroundings; they include the Equestrian Centre, the Hockey Centre, the BMX Centre, the whitewater canoe slalom course and the Deodoro Stadium, which will host the rugby sevens and modern pentathlon.

It is possible to see the Olympic venues from space! Good luck to all the athletes competing over the next few weeks.

The cost of ‘free data’

False Colour Composite of the Black Rock Desert, Nevada, USA.  Image acquired on 6th April 2016. Data courtesy of NASA/JPL-Caltech, from the Aster Volcano Archive (AVA).


Last week, the US and Japan announced free public access to the archive of nearly 3 million images taken by the ASTER instrument; previously this data had only been accessible for a nominal fee.

ASTER, the Advanced Spaceborne Thermal Emission and Reflection Radiometer, is a joint Japan-US instrument aboard NASA's Terra satellite, with the data used to create detailed maps of land surface temperature, reflectance and elevation. When NASA made the Landsat archive freely available in 2008, an explosion in usage occurred. Will the same happen with ASTER?

As a remote sensing advocate I want many more people to be using satellite data, and I support any initiative that contributes to this goal. Public satellite data archives, such as Landsat, are often referred to as 'free data'. This phrase is unhelpful, and I prefer the term 'free to access'. This is because 'free data' isn't free: someone has already paid to get the satellites into orbit, download the data from the instruments and then provide the websites for making this data available. So, who has paid for it? To be honest, it's you and me!

To be accurate, these missions are generally funded by the taxpayers of the countries that put the satellites up. For example:

  • ASTER was funded by the American and Japanese public,
  • Landsat is funded by the American public, and
  • the Sentinel satellites, under the Copernicus programme, are funded by the European public.

In addition to making basic data available, missions often also create a series of products derived from the raw data. This is achieved either by commercial companies being paid grants to create these products, which can then be offered as free-to-access datasets, or by the companies developing the products themselves and then charging users to access them.

'Free data' also creates user expectations, which may be unrealistic. Whenever a potential client comes to us, there is always a discussion on which data source to use. Pixalytics is a data-independent company, and we suggest the best data to suit the client's needs. However, the best data isn't always a free-to-access dataset! There are a number of physical and operating criteria that need to be considered:

  • Spectral wavebands / frequency bands: wavelengths for optical instruments and frequencies for radar instruments, which determine what can be detected.
  • Spatial resolution: the size of the smallest objects that can be 'seen'.
  • Revisit time: how often you are likely to get a new image – important if you're interested in several acquisitions close together.
  • Long-term data archives: very useful if you want to look back in time.
  • Availability: for example, delivery schedule and ordering requirements.

We don't want any client to pay for something they don't need, but sometimes commercial data is the best solution. As the cost of this data can range from a few hundred to thousands of pounds, this can be a challenging conversation given all the promotion of 'free data'.

So, what’s the summary here?

If you're analysing large amounts of data, e.g. for a time series or large geographical areas, then free-to-access public data is a good choice, as buying hundreds of images would often get very expensive and the higher spatial resolution isn't always needed. However, if you want a specific acquisition over a specific location at high spatial resolution, then the commercial missions come into their own.
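A back-of-the-envelope calculation makes the trade-off concrete. The per-scene price below is purely hypothetical (commercial pricing varies widely and is quoted per mission, area and licence), but it shows why a long time series quickly favours free-to-access data:

```python
# Illustrative only: a made-up per-scene price for commercial imagery.
HYPOTHETICAL_PRICE_PER_SCENE_GBP = 500

def commercial_archive_cost(n_scenes, price_per_scene=HYPOTHETICAL_PRICE_PER_SCENE_GBP):
    """Total cost of buying n commercial scenes outright."""
    return n_scenes * price_per_scene

# A five-year monthly time series needs 60 scenes.
cost = commercial_archive_cost(60)
print(f"60 scenes at a hypothetical £500 each: £{cost:,}")  # £30,000
```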

Just remember, no satellite data is truly free!

Temporal: The forgotten resolution

Time, Copyright: scanrail / 123RF Stock Photo


Temporal resolution shouldn't be forgotten when considering satellite imagery; however, it's often neglected, with its partners, spatial and spectral resolution, getting the limelight. The reason is the special relationship between spatial and spectral resolution: a higher spectral resolution has meant a lower spatial resolution, and vice versa, because of limited on-board storage and transmission capabilities. Therefore, when considering imagery most people focus on whatever best suits their spatial or spectral needs, rarely giving temporal resolution a second thought beyond whether immediate data acquisition is required.

Temporal resolution is the time it takes a satellite to return to collect data for exactly the same location on Earth, also known as the revisit or repeat time, expressed in hours or days. Global-coverage satellites tend to have low Earth polar, or near-polar, orbits, travelling at around 27,000 kph and taking around 100 minutes to circle the Earth. During each orbit the Earth rotates twenty-five degrees around its polar axis, so on each successive orbit the ground track moves to the west, and it takes a couple of weeks to cover the whole globe; for example, Landsat has a 16-day absolute revisit time.
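The figures above can be checked with a couple of lines of arithmetic: a ~100-minute orbital period against the Earth's rotation rate of 0.25 degrees per minute gives the westward shift of the ground track per orbit.

```python
# Back-of-the-envelope check of the orbit figures quoted above.
orbital_period_min = 100                       # typical low Earth orbit
earth_rotation_deg_per_min = 360 / (24 * 60)   # 0.25 deg per minute

# Westward shift of the ground track during one orbit.
shift_per_orbit_deg = orbital_period_min * earth_rotation_deg_per_min
print(f"ground track shifts {shift_per_orbit_deg:.0f} degrees west per orbit")

# How many orbits fit into a day at this period.
orbits_per_day = 24 * 60 / orbital_period_min
print(f"about {orbits_per_day:.1f} orbits per day")
```

This reproduces the twenty-five degrees per orbit quoted above, with roughly 14 orbits per day.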

Only seeing the part of the Earth you want to image once every few weeks isn't very helpful if you want to see daily changes. Therefore, there are a number of techniques satellites use to improve temporal resolution:

  • Swath width – The swath is the area of ground the satellite sees on each orbit: the wider the swath, the greater the ground coverage, although a wider swath generally means a lower spatial resolution. A satellite with a wide swath will have significant overlaps between orbits, allowing areas of the Earth to be imaged more frequently and reducing the revisit time. MODIS uses a wide swath and images the globe every one to two days.
  • Constellations – Two identical satellites orbiting one hundred and eighty degrees apart will halve the revisit time, an approach being used by ESA's Sentinel missions. Sentinel-1A was launched in 2014, with its twin Sentinel-1B due to be launched in 2016; when operating together they will provide a temporal resolution of six days. Obviously, adding more satellites to a constellation reduces the revisit time further.
  • Pointing – High-resolution satellites in particular use this method, which allows a satellite to point its sensor at a particular spot on Earth and so map the same area from multiple orbits. However, pointing changes the angle at which the sensor views the Earth, which can distort the ground area it observes.
  • Geostationary orbits – Although technically not a revisit in the same sense, a geostationary satellite remains focussed on the same area of the Earth at all times, so the temporal resolution is simply how often imagery is taken, for example every fifteen minutes. The drawback is that only a restricted area can be mapped.

Hopefully, this has given you a little insight into temporal resolution. Whilst spectral and spatial resolution are important factors when considering what imagery you need, do spend a bit of time considering your temporal needs too!