Two New Earth Observation Satellites Launched

Artist’s rendition of a satellite – paulfleet/123RF Stock Photo

Two new Earth observation satellites were launched last week from the Guiana Space Centre in Kourou, French Guiana, although you may only get to see the data from one of them. Venµs and OPTSAT-3000 were put into sun-synchronous orbits on the 1st August by Arianespace's Vega launch vehicle. Both satellites were built by Israel's state-owned Israel Aerospace Industries and carry instruments from Israel's Elbit Systems.

Venµs, or to give its full title of Vegetation and Environment monitoring on a New MicroSatellite, is a joint scientific collaboration between the Israeli Space Agency (ISA) and France’s CNES space agency.

Venµs is focussed on environmental monitoring, including climate, soil and topography. Its aim is to help improve the techniques and accuracy of global models, with a particular emphasis on understanding how environmental and human factors influence plant health. The satellite is equipped with the VENµS Superspectral Camera (VSSC), which uses 12 narrow spectral bands in the Visible Near Infrared (VNIR) spectrum – ranging from 420 nm to 910 nm in wavelength – to capture 12 simultaneous overlapping high resolution images which are then combined into a single image. The camera uses a pushbroom collection technique and has a spatial resolution of 5.3 m and a swath width of 27.56 km.

Venµs won’t have full global coverage; instead, there are 110 areas of interest around the world that include forests, croplands and nature reserves. It has a two-day revisit time, during which it completes 29 orbits of the planet, meaning every thirtieth image will be collected over the same place, at the same time and with the same angle. This will provide high resolution imagery more frequently than is currently available from existing EO satellites. The consistency of place, time and angle will help researchers better assess fine-scale changes on the land to improve our understanding of the:

  • state of the soil,
  • vegetation growth,
  • detection of spreading disease or contamination,
  • snow cover and glacial movements, and
  • sediment movement in coastal estuaries.

A specific software algorithm has been developed for the mission to work with the different wavelengths to remove clouds and aerosols from the satellite’s imagery, giving clear images of the planet irrespective of atmospheric conditions.

The second satellite launched was OPTSAT-3000, an Italian-controlled optical surveillance satellite that will operate in conjunction with the COSMO-SkyMed radar satellites, giving Italy’s Ministry of Defence an autonomous national Earth observation capability across both optical and radar imagery.

This is a military satellite, and so some of the details are difficult to verify. As mentioned earlier, the instrument was made by Elbit Systems, and the camera used usually offers a spatial resolution of around 0.5 m. However, it has been reported that the resolution will be much closer to 0.3 m because the satellite is in a very low Earth orbit of around 450 km.

OPTSAT-3000 will collect high resolution imagery of the Earth. It’s not clear at this stage whether any of the imagery will be made available for commercial or scientific use or purchase, although it is worth noting that COSMO-SkyMed images are sold.

The launch of two more Earth observation satellites shows that our industry keeps moving forward! We’re really interested – and, in OPTSAT-3000’s case, hopeful – to see the imagery they produce.

Locusts & Monkeys

Soil moisture data from the SMOS satellite and the MODIS instrument, acquired between July and October 2016, were used by isardSAT and CIRAD to create this map showing areas with favourable locust swarming conditions (in red) during the November 2016 outbreak. Data courtesy of ESA. Copyright: CIRAD, SMELLS consortium.

Spatial resolution is a key characteristic in remote sensing, as we’ve previously discussed. Often the view is that you need an object to be significantly larger than the resolution to be able to see it in an image. However, this is not always the case, as satellites can often identify indicators of objects that are much smaller.

We’ve previously written about satellites identifying phytoplankton in algal blooms, and recently two interesting reports have described how satellites are being used to determine the presence of locusts and monkeys!


Desert locusts are a type of grasshopper, and whilst individually they are harmless, as a swarm they can cause huge damage to the populations in their path. Between 2003 and 2005 a swarm in West Africa affected eight million people, with reported losses of 100% for cereals, 90% for legumes and 85% for pasture.

Swarms occur when certain conditions are present; namely a drought, followed by rain and vegetation growth. ESA and the UN Food and Agriculture Organization (FAO) have been working together to determine if data from the Soil Moisture and Ocean Salinity (SMOS) satellite can be used to forecast these conditions. SMOS carries the Microwave Imaging Radiometer using Aperture Synthesis (MIRAS) instrument – a 2D interferometric L-band radiometer with 69 antenna receivers distributed on a Y-shaped deployable antenna array. It observes the ‘brightness temperature’ of the Earth, which indicates the radiation emitted from the planet’s surface. It has a temporal resolution of three days and a spatial resolution of around 50 km.

By combining the SMOS soil moisture observations with data from NASA’s MODIS instrument, the team were able to downscale SMOS to 1km spatial resolution and then use this data to create maps. This approach then predicted favourable locust swarming conditions approximately 70 days ahead of the November 2016 outbreak in Mauritania, giving the potential for an early warning system.
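The general idea of this kind of downscaling – spreading a coarse soil-moisture value over finer pixels using a correlated fine-scale layer – can be sketched in a few lines. This is only an illustration of the principle, not the SMELLS consortium’s actual algorithm; the function name and weighting scheme are our own.

```python
import numpy as np

def downscale_cell(coarse_value, fine_weights):
    """Toy disaggregation of one coarse soil-moisture cell (e.g. a ~50 km
    SMOS pixel) over its fine sub-pixels (e.g. 1 km), using a fine-scale
    weighting layer such as a MODIS-derived vegetation/temperature index.
    Sub-pixel values are scaled so their mean preserves the coarse
    observation. Assumes the weights are positive."""
    w = np.asarray(fine_weights, dtype=float)
    return coarse_value * w / w.mean()
```

For example, a coarse value of 0.3 m³/m³ spread over three sub-pixels with weights 1, 2 and 3 gives sub-pixel values whose mean is still 0.3.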

This is interesting for us as we’re currently using soil moisture data in a project to provide an early warning system for droughts and floods.


Earlier this month the paper ‘Connecting Earth Observation to High-Throughput Biodiversity Data’ was published in the journal Nature Ecology & Evolution. It describes the work of scientists from the Universities of Leicester and East Anglia who have used satellite data to help identify monkey populations that have declined through hunting.

The team have used a variety of technologies and techniques to pull together indicators of monkey distribution, including:

  • Earth observation data to map roads and human settlements.
  • Automated recordings of animal sounds to determine which species are in the area.
  • Analysis of captured mosquitoes to determine what they have been feeding on.

Combining these various datasets provides a huge amount of information, and can be used to identify areas where monkey populations are vulnerable.

These projects demonstrate an interesting capability of satellites, which is not always recognised and understood. By using satellites to monitor certain aspects of the planet, the data can be used to infer things happening on a much smaller scale than individual pixels.

Have you read the top Pixalytics blogs of 2016?

Artist’s rendition of a satellite – paulfleet/123RF Stock Photo

As this is the final blog of the year, we’d like to take a look back over the past fifty-two weeks and see which blogs captured people’s attention, and conversely which did not!

It turns out that seven of the ten most widely viewed blogs of the last year weren’t even written in 2016. Four were written in 2015, and three were written in 2014! The other obvious trend is the interest in the number of satellites in space, which can be seen by the titles of six of the ten most widely read blogs:

We’ve also found these blogs quoted by a variety of other web pages, and the occasional report. It’s always interesting to see where we’re quoted!

The other most read blogs of the year were:

Whilst only three of 2016’s blogs made our top ten, this is partly understandable as they have had less time to attract the interest of readers and Google. However, looking at the most read blogs of 2016 shows an interest in the growth of the Earth observation market, Brexit, different types of data and Playboy!

We’ve now completed three years of weekly blogs, and the views on our website have grown steadily. This year has seen a significant increase in viewed pages, which is something we’re delighted to see.

We like our blog to be of interest to our colleagues in remote sensing and Earth observation, although we also touch on issues of interest to the wider space, and small business, communities.

At Pixalytics we believe strongly in education and training in both science and remote sensing, together with supporting early career scientists. As such we have a number of students and scientists working with us during the year, and we always like them to write a blog. Something they’re not always keen on at the start! This year we’ve had pieces on:

Writing a blog each week can be hard work, as Wednesday mornings always seem to come around very quickly. However, we think this work adds value to our business and makes a small contribution to explaining the industry in which we work.

Thanks for reading this year, and we hope we can catch your interest again next year.

We’d like to wish everyone a Happy New Year, and a very successful 2017!

Rio Olympics from space

Rio de Janeiro, Brazil, acquired on the 13th July 2016. Image courtesy of Copernicus/ESA.

The Opening Ceremony of the 2016 Summer Olympics takes place on Friday and so we’ve decided to revive our highly infrequent blog series ‘Can you see sporting venues from space?’ Previously we’ve looked for the Singapore and Abu Dhabi Formula One Grand Prix Circuits, but this week we’re focussing on the Rio Olympic venues.

Rio de Janeiro
The Games of the XXXI Olympiad will take place from the 5th to the 21st August in the Brazilian city of Rio de Janeiro. It is expected that more than ten thousand athletes will compete for the 306 Olympic titles across 37 venues, 7 of which are temporary and 5 of which are outside Rio. The remaining 25 are permanent venues within the city, 11 of which have been newly built for the Olympics and Paralympics. It is these permanent venues that we’ll try to spot from space!

The image at the top of the blog shows the Rio area, and you’ll notice the dark green area in the centre of the image, which is the Tijuca National Park containing one of the world’s largest urban rainforests. It covers an area of 32 km².

Spatial Resolution
Spatial resolution is the key characteristic in whether sporting venues can be seen from space; in simplistic terms, it refers to the smallest object that can be seen on Earth by that sensor. For example, an instrument with a 10 m spatial resolution means that each pixel of its image represents 10 m on the ground, and therefore for something to be distinguishable on that image it needs to be larger than 10 m in size. There are exceptions to this rule, such as gas flares, which are so bright that they can dominate a much larger pixel.

We used the phrase ‘simplistic terms’ above because technically the sensor in the satellite doesn’t actually see a square pixel; instead it sees an ellipse, due to the angle through which it receives the signal. The ellipses are turned into square pixels during data processing to create the image. Spatial resolution is generally considered to have four categories:

  • Low spatial resolution: pixels tend to be between 50 m and 1 km.
  • Medium spatial resolution: pixels tend to be between 4 m and 50 m.
  • High spatial resolution: pixels tend to be between 1 m and 4 m.
  • Very high spatial resolution: pixels tend to be between 0.25 m and 1 m.
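These categories can be captured in a small helper function; the name and boundary handling below are our own choices, purely for illustration.

```python
def resolution_category(pixel_size_m):
    """Classify a sensor's pixel size (in metres) into the four broad
    spatial-resolution categories listed above. Boundary values are
    assigned to the coarser category's lower edge by convention here."""
    if 0.25 <= pixel_size_m < 1:
        return "very high"
    if 1 <= pixel_size_m < 4:
        return "high"
    if 4 <= pixel_size_m < 50:
        return "medium"
    if 50 <= pixel_size_m <= 1000:
        return "low"
    return "outside the listed categories"
```

Sentinel-2’s 10 m visible bands, for example, fall into the medium category, whilst WorldView imagery at 0.5 m counts as very high.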

Clearly, very high resolution imagery, such as that provided by the commercial WorldView satellites owned by DigitalGlobe, can provide great images of the Olympic venues. However, as you know, we like to work with data that is free to access, rather than paid-for data. We’ve used Sentinel-2 data for this blog; the multispectral imager it carries has a 10 m spatial resolution for its visible and near infra-red bands.

Can we see the Olympic venues from space?
In the earlier parts of this infrequent series we couldn’t see the night race at the Singapore circuit, but we did identify the Abu Dhabi track and the red roof of the Ferrari World theme park. So can we see the Olympics? Actually, we can!

Image courtesy of Copernicus/ESA.

On the image to the left, you’ll notice two bright white circles: one in the middle of the image and a second to the south-east. The bright circle in the middle is the Olympic Stadium, which will host the athletics and stands out clearly from the buildings surrounding it. To the south-east is the Maracanã Stadium, which will stage the opening and closing ceremonies together with the finals of the football tournaments.

Image courtesy of Copernicus/ESA.

In the bottom left of the image is a small triangular shape, which is the location of the Aquatics Stadium, the Olympic Tennis Centre, the gymnastics and wheelchair basketball arenas, and the Carioca arenas, which will host basketball, judo, wrestling and boccia. The bottom of the triangle juts out into the Jacarepaguá Lagoon.

Image courtesy of Copernicus/ESA.

In the top left of the image you can see the runway of the military Afonsos Air Force Base. North of the air base are a number of other Olympic venues, although these are hard to spot within their surroundings; they include the Equestrian Centre, the Hockey Centre, the BMX Centre, the whitewater canoe slalom course and the Deodoro Stadium, which will host the rugby sevens and modern pentathlon.

It is possible to see the Olympic venues from space! Good luck to all the athletes competing over the next few weeks.

The cost of ‘free data’

False Colour Composite of the Black Rock Desert, Nevada, USA. Image acquired on 6th April 2016. Data courtesy of NASA/JPL-Caltech, from the Aster Volcano Archive (AVA).

Last week, the US and Japan announced free public access to the archive of nearly 3 million images taken by the ASTER instrument; previously this data had only been accessible for a nominal fee.

ASTER, the Advanced Spaceborne Thermal Emission and Reflection Radiometer, is a joint Japan-US instrument aboard NASA’s Terra satellite, with the data used to create detailed maps of land surface temperature, reflectance and elevation. When NASA made the Landsat archive freely available in 2008, an explosion in usage occurred. Will the same happen to ASTER?

As a remote sensing advocate I want many more people to be using satellite data, and I support any initiative that contributes to this goal. Public satellite data archives, such as Landsat, are often referred to as ‘free data’. This phrase is unhelpful, and I prefer the term ‘free to access’. This is because ‘free data’ isn’t free: someone has already paid to put the satellites into orbit, download the data from the instruments and provide the websites that make this data available. So, who has paid for it? To be honest, it’s you and me!

To be accurate, these missions are generally funded by the taxpayers of the country that put the satellite up. For example:

  • ASTER was funded by the American and Japanese public.
  • Landsat is funded by the American public.
  • The Sentinel satellites, under the Copernicus programme, are funded by the European public.

In addition to making basic data available, missions often also create a series of products derived from the raw data. This is achieved either by commercial companies being paid grants to create these products, which can then be offered as free to access datasets, or by the companies developing the products themselves and charging users for access to them.

‘Free data’ also creates user expectations, which may be unrealistic. Whenever a potential client comes to us, there is always a discussion on which data source to use. Pixalytics is a data independent company, and we suggest the best data to suit the client’s needs. However, the best data isn’t always a free to access dataset! There are a number of physical and operating criteria that need to be considered:

  • Spectral wavebands / frequency bands: wavelengths for optical instruments and frequencies for radar instruments, which determine what can be detected.
  • Spatial resolution: the size of the smallest objects that can be ‘seen’.
  • Revisit times: how often you are likely to get a new image – important if you’re interested in several acquisitions close together.
  • Long-term archives of data: very useful if you want to look back in time.
  • Availability: for example, delivery schedule and ordering requirements.

We don’t want any client to pay for something they don’t need, but sometimes commercial data is the best solution. As the cost of this data can range from a few hundred to thousands of pounds, this can be a challenging conversation amid all the promotion of ‘free data’.

So, what’s the summary here?

If you’re analysing large amounts of data, e.g. for a time-series or large geographical areas, then free to access public data is a good choice as buying hundreds of images would often get very expensive and the higher spatial resolution isn’t always needed. However, if you want a specific acquisition over a specific location at high spatial resolution then the commercial missions come into their own.

Just remember, no satellite data is truly free!

Temporal: The forgotten resolution

Time, Copyright: scanrail / 123RF Stock Photo

Temporal resolution shouldn’t be forgotten when considering satellite imagery; however, it’s often neglected, with its partners of spatial and spectral resolution getting the limelight. The reason is the special relationship spatial and spectral resolution have, where a higher spectral resolution has meant a lower spatial resolution and vice versa, because of limited onboard storage and transmission capabilities. Therefore, when considering imagery, most people focus on whatever best suits their spatial or spectral needs, rarely giving temporal resolution a second thought beyond whether immediate data acquisition is required.

Temporal resolution is the time it takes a satellite to return and collect data for exactly the same location on Earth, also known as the revisit or repeat time, expressed in hours or days. Global coverage satellites tend to have low Earth polar, or near-polar, orbits, travelling at around 27,000 km/h and taking around 100 minutes to circle the Earth. During each orbit the Earth rotates about twenty-five degrees around its polar axis, and so on each successive orbit the ground track moves to the west; it therefore takes a couple of weeks to build up complete coverage – Landsat, for example, has a 16 day absolute revisit time.
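The twenty-five degree figure can be checked with a one-line calculation: during one orbital period the Earth turns through a proportional fraction of 360 degrees, using the sidereal day of roughly 1,436 minutes for the rotation rate. A minimal sketch:

```python
def ground_track_shift_deg(orbital_period_min):
    """Westward shift of a satellite's ground track per orbit, in degrees
    of longitude, caused by Earth's rotation during one orbital period.
    Uses the sidereal day (~1436.07 minutes) for the rotation rate."""
    sidereal_day_min = 1436.07
    return 360.0 * orbital_period_min / sidereal_day_min

# A ~100-minute orbit shifts roughly 25 degrees west per revolution,
# matching the figure quoted above.
```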

Only seeing the part of the Earth you want to image once every few weeks isn’t very helpful if you want to see daily changes. Therefore, there are a number of techniques satellites use to improve temporal resolution:

  • Swath Width – A swath is the area of ground the satellite sees on each orbit; the wider the swath, the greater the ground coverage, but generally a wider swath means lower spatial resolution. A satellite with a wide swath will have significant overlaps between orbits, allowing areas of the Earth to be imaged more frequently and reducing the revisit time. MODIS uses a wide swath, and it images the globe every one to two days.
  • Constellations – If you have two identical satellites orbiting one hundred and eighty degrees apart you will reduce revisit times, and this approach is being used by ESA’s Sentinel missions. Sentinel-1A was launched in 2014, with its twin, Sentinel-1B, due to be launched in 2016. When operating together they will provide a temporal resolution of six days. Obviously, adding more satellites to a constellation will reduce the revisit time further.
  • Pointing – High-resolution satellites in particular use this method, which allows a satellite to point its sensors at a particular location on Earth and so map the same area from multiple orbits. However, pointing changes the angle at which the sensor views the Earth, which means the ground area it observes can be distorted.
  • Geostationary Orbits – Although technically not a revisit in the same sense, a geostationary satellite remains focussed on the same area of the Earth at all times, and so the temporal resolution is simply how often imagery is taken – for example, every fifteen minutes. The drawback is that you can only map a restricted area.

Hopefully this has given you a little insight into temporal resolution; whilst spectral and spatial resolution are important factors when considering what imagery you need, do spend a bit of time considering your temporal needs too!

SMAP ready to map!

Artist’s rendering of the Soil Moisture Active Passive satellite.
Image credit: NASA/JPL-Caltech

On the 31st January NASA launched their Soil Moisture Active Passive satellite, generally known by the more pronounceable acronym SMAP, aboard a Delta 2 rocket. It will go into a near-polar sun-synchronous orbit at an altitude of 685 km.

The SMAP mission will measure the amount of water in the top five centimetres of soil, and whether the ground is frozen or not. These two measurements will be combined to produce global maps of soil moisture to improve understanding of the water, carbon and energy cycles. This data will support applications including weather forecasting, drought monitoring, flood prediction and crop productivity, as well as providing valuable information for climate science.

The satellite carries two instruments: a passive L-band radiometer and an active L-band synthetic aperture radar (SAR). Once in space the satellite will deploy a spinning 6 m gold-coated mesh antenna, which will measure the backscatter of radar pulses, and the naturally occurring microwave emissions, from the Earth’s surface. Rotating 14.6 times every minute, the antenna will trace overlapping loops of 1,000 km, giving a wide measurement swath. This means that whilst the satellite itself has an eight day repeat cycle, SMAP will take global measurements every two to three days.

Interestingly, although similar antennas have previously been used on large communication satellites, this will be the first time a deployable antenna – and the first time a spinning one – has been used for scientific measurement.

The radiometer has a high soil moisture measurement accuracy but a spatial resolution of only 40 km, whereas the SAR instrument has a much higher spatial resolution of 10 km but lower soil moisture measurement sensitivity. Combining the passive and active observations will give measurements of soil moisture at 10 km, and freeze/thaw ground state at 3 km. Whilst SMAP is focussed on mapping Earth’s non-water surface, it’s also anticipated to provide valuable data on ocean salinity.

SMAP will provide data about soil moisture content across the world, the variability of which is not currently well understood, yet is vital to understanding both the water and carbon cycles that influence our weather and climate.

Why counting animals from space isn’t as hard as you think

Great Migration in Maasai Mara National Park, Kenya; copyright alextara / 123RF Stock Photo

Last week the keepers at London Zoo were busy counting their 17,000 animals as part of the annual headcount. Knowing numbers is vital in the wild too, but counting animals on the plains of Africa is more challenging. Traditionally, wild counts are either ground surveys, which take people and time, or aerial surveys, which can spook the animals. Satellite remote sensing could offer a potential solution, but it’s not straightforward. Three papers published in 2014 show the possibilities, and challenges, of using satellites to count animals.

The paper Spotting East African Mammals in Open Savannah from Space by Zheng Yang et al (2014), published on the 31st December, describes the use of very high resolution GeoEye-1 satellite images to detect large animals in the Maasai Mara National Reserve, Kenya. GeoEye-1’s 2 m multispectral image resolution was not sufficient to detect large animals. However, when combined with the panchromatic image using a pan-sharpening technique, the resolution improved to 0.5 m, meaning adult wildebeest and zebras were 3 to 4 pixels long and 1 to 2 pixels wide. Experienced Kenyan wildlife researchers initially reviewed the images visually to develop a classification system, forming the basis of a hybrid approach using both pixel-based and object-based image assessment to determine which pixels belonged to animals. The results showed an average count error of 8.2% compared to manual counts, with an omission error rate of 6.6%, which demonstrates that satellites have potential for use in counting; it’s cheaper and less intrusive than existing methods.
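Pan sharpening can be done in several standard ways, and the paper’s exact method isn’t described here, but a Brovey-style transform gives the flavour: each multispectral band (already resampled to the panchromatic grid) is scaled by the ratio of the panchromatic band to the mean multispectral intensity. A sketch, with array shapes of our own choosing:

```python
import numpy as np

def brovey_pan_sharpen(ms, pan):
    """Brovey-style pan sharpening sketch. `ms` is a (bands, rows, cols)
    multispectral stack already upsampled to the panchromatic grid, and
    `pan` is the (rows, cols) panchromatic band. Each band is scaled by
    the pan-to-intensity ratio, injecting the pan band's spatial detail.
    One of several standard methods, shown purely for illustration."""
    intensity = ms.mean(axis=0)
    ratio = pan / np.maximum(intensity, 1e-12)  # guard against divide-by-zero
    return ms * ratio
```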

The second paper, published by Seth Stapleton et al (2014) and entitled Assessing Satellite Imagery as a Tool to Track Arctic Wildlife, used 0.5 m resolution imagery of Rowley Island in Foxe Basin, Canada, from WorldView-2 to monitor the island’s polar bear population. The images were corrected for terrain and solar irradiance, and a histogram stretch was applied to brighten darker, non-ice areas to help human analysts identify the bears. Two observers visually identified ‘presumed bears’, both individually and jointly, resulting in the identification of 92 presumed bears. This satellite-derived figure was consistent with other population estimates, again offering a potentially cheaper and safer way of monitoring polar bears.
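A histogram stretch like the one described can be sketched as a simple percentile-based linear rescale; the paper’s exact parameters aren’t given, so the 2nd/98th percentile defaults below are our own assumption.

```python
import numpy as np

def percentile_stretch(img, low=2, high=98):
    """Linear contrast stretch between two percentiles, clipped to [0, 1].
    Brightens darker regions by spending the output range on the bulk of
    the histogram rather than on outliers. Percentile choices here are
    illustrative defaults, not the paper's published parameters."""
    lo, hi = np.percentile(img, [low, high])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)
```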

Finally, Peter Fretwell et al (2014) published Counting Southern Right Whales by Satellite. Also using WorldView-2, they used a 2 m resolution image with eight colour bands and one panchromatic band. The images were analysed using ENVI 5 and ArcGIS to identify potential and probable whales, and visual inspection then confirmed they had identified objects of the right shape and size to be whales, resulting in the identification of 55 probable whales and 23 possible whales. This again showed that satellite images could be useful for estimating whale populations faster and more efficiently.

All three of these papers demonstrate that satellite remote sensing has potential to assist in the monitoring of animal species across the globe. However, there are also significant challenges still to overcome, for example:

  • Resolution: Currently available resolutions may not be sufficient to distinguish the level of detail conservationists need, such as species identification in Africa or polar bear cubs in Canada. However, it may be possible with very high resolution satellites such as the planned WorldView-4 from DigitalGlobe.
  • Cloud cover: The persistent nemesis of optical Earth observation imagery may hamper its use in certain areas or seasons.
  • Complicated environments: Further research is needed to ensure animals can be accurately distinguished from their surroundings.

Despite these reservations, the potential to offer regular, more efficient and safer methods of surveying animal populations from space means this will be a rapidly developing area of Earth observation.

The Small and Mighty Proba Missions

This week the European Space Agency announced the latest mission in the Project for OnBoard Autonomy (PROBA) mini-satellite programme. Proba-3 is planned to launch in four years, and will be a pair of satellites flying in close formation, 150 m apart, with the front satellite creating an artificial eclipse of the Sun, allowing its companion views of the solar corona, normally only visible momentarily during solar eclipses.

Tamar estuary captured in October 2005, data courtesy of ESA.

The Proba missions are part of ESA’s In-Orbit Technology Demonstration Programme, which focuses on testing, and using, innovative technologies in space. Despite Proba-3’s nomenclature, it will be the fourth mission in the Proba programme. The first, Proba-1, was launched on the 22nd October 2001 on a planned two-year Earth observation (EO) mission; despite that planned lifecycle, thirteen years later it is still flying and sending back EO data. It’s in a sun-synchronous orbit with a seven-day repeat cycle and carries eight instruments. The main one is the Compact High Resolution Imaging Spectrometer (CHRIS), developed in the UK by the Space Group of Sira Technology Ltd, later acquired by Surrey Satellite Technology Limited. CHRIS is a hyperspectral sensor that acquires a set of up to five images of a target, with different modes allowing the collection of up to 62 spectral wavebands.

Plymouth, where Pixalytics is based, and our lead consultant, Dr Samantha Lavender, have a long history with Proba-1. Rame Head point, along the coast from Plymouth, is one of the test sites for the CHRIS instrument, and she’s been doing research using the data it provides for over a decade. Over Plymouth, Mode 2 is used, which focuses on mapping the water at a spatial resolution of 17 m; this mode was proposed by Sam back in the early days of CHRIS-Proba. The image at the top of the page, captured in October 2005, shows the Tamar estuary in the UK, which separates the counties of Devon and Cornwall; for this image CHRIS was pointed further north due to planned fieldwork activities. At the bottom of the image is the thick line of the Tamar Road Bridge and below it, the thinner Brunel railway bridge. Plymouth is to the right of the bridge, and to the left is the Cornish town of Saltash.

Proba-V image of the Nile Delta in Egypt, courtesy of the Belgian PROBA-V / ESA Earth Watch programmes

Proba-2 was launched in 2009, carrying two solar observation experiments, two space weather experiments and seventeen other technology demonstrations. ESA returned to EO for the third mission, Proba-V, launched on the 7th May 2013; the change in nomenclature is because the V stands for vegetation. The satellite carries a redesign of the ‘Vegetation’ imaging instrument flown on the French SPOT satellites; it has a 350 m ground resolution with a 2,250 km swath, and collects data in the blue, red, near-infrared and mid-infrared wavebands. It provides worldwide coverage every two days, and through its four spectral bands it can distinguish between different types of land cover. The image on the right is from Proba-V, showing the Nile delta on the 2nd May 2014.

Despite their small stature all the Proba satellites are showing their resilience by remaining operational, and they’re playing a vital role in allowing innovative new technologies to be tested in space.

Can Earth Observation answer your question?

The opportunities and challenges of utilising Earth observation (EO) data played out in microcosm in our house over the weekend. On Sunday afternoon, I was watching highlights of the Formula One Singapore Grand Prix which takes place on the harbour streets of Marina Bay and is the only night race of the season. To ensure the drivers can see, there are over 1,500 light projectors installed around the circuit giving an illumination of around 3,000 lux.

Whilst watching, I wondered aloud whether we’d be able to see the track from space with the additional floodlights. My idle wondering caught Sam’s interest far more than the actual race, and she decided to see if she could answer the question. The entire circuit is just over five kilometres long, but it’s a loop, so it has a footprint of roughly two kilometres; any imagery would need a spatial resolution much smaller than this. The final difficulty is that the data needed to be from this weekend, as the circuit is only floodlit for the racing.

Within a few laps Sam had identified free near real time night-time data from the United States National Oceanic & Atmospheric Administration (NOAA) which covered the required area and timeframe. This was from the Visible Infrared Imaging Radiometer Suite (VIIRS), using its Day/Night Band with a 750 m spatial resolution – this resolution meant we would not be able to see the outline of the track, as it would be represented by only three or four pixels, but it would be interesting to see if we could identify the track as a feature. By the end of the race Sam had selected and downloaded the data, and so we could answer my question. However, it turned out to be not quite that easy.

VIIRS Singapore night time imagery, data courtesy of NOAA

NOAA data uses a slightly different format to the image processing packages we had, and we couldn’t initially see what we’d downloaded. Sam had to write some computer code to modify the packages to read the NOAA data. For anyone thinking this is an odd way to spend a Sunday evening: to Sam this was a puzzle to solve, and she was enjoying herself! After some rapid coding we were able to view the image, but unfortunately the Saturday data wasn’t useful. On Monday we tried again; the Sunday race took place on a clear night, and we got a good image of the area, which you can see above. On the larger image you can clearly see the Indonesian islands, with Jakarta shining brightly, up through the Java Sea, where the lights of some ships are visible, and then at the top of the image is Singapore; the zoomed-in version of Singapore is the inset image.
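Once the data could be read, producing a zoomed-in inset like the Singapore one is essentially a matter of cropping the grid to a latitude/longitude bounding box. A generic numpy sketch (the function and variable names are ours, not NOAA’s, and it assumes 1-D coordinate arrays for the image’s rows and columns):

```python
import numpy as np

def crop_to_bbox(data, lats, lons, lat_min, lat_max, lon_min, lon_max):
    """Crop a gridded image to a lat/lon bounding box, given 1-D latitude
    and longitude arrays describing its rows and columns. Assumes the box
    overlaps the grid. Singapore sits at roughly 1.1-1.5 N, 103.6-104.1 E."""
    rows = np.where((lats >= lat_min) & (lats <= lat_max))[0]
    cols = np.where((lons >= lon_min) & (lons <= lon_max))[0]
    return data[rows.min():rows.max() + 1, cols.min():cols.max() + 1]
```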

Despite the floodlights used for the race, Singapore and some of the surrounding Malaysian cities are so bright at night that the additional lights simply contribute to the overall illumination, rather than making the track stand out. Hence the answer to my question is that the 2014 floodlit Singapore F1 street circuit can’t be distinguished from the surrounding area at this spatial resolution. Of course, if we purchased high resolution imagery we might be able to see more detail, but we thought that was going a bit far for my idle wondering!

EO can answer questions like these quickly. Whilst we know not many businesses depend on whether the Singapore Grand Prix can be seen from space, change the question to what light pollution is like in your area, what is happening in terms of deforestation in the middle of the jungle, or what phytoplankton are doing in the middle of the ocean – whatever question you might have, EO may be able to provide the answer in a short space of time.

However, there are two main difficulties in getting the answer. Firstly, you’ve got to know where to find the data, and secondly, what to do with it when you get it. Currently this can be challenging without specialist knowledge, making it inaccessible for the general population. In the coming weeks we’re going to write some blogs looking at the freely available EO data, and the easiest ways of viewing it. Hopefully this may help you answer your own questions. In the meantime, if you have questions you want answered, get in touch – we’d be happy to help.