If no-one is there when an iceberg is born, does anyone see it?

Larsen C Ice Shelf including the A68 iceberg. Image acquired by the MODIS instrument on the Aqua satellite on 12th July 2017. Image courtesy of NASA.

The question in the title, a paraphrase of the famous falling-tree-in-the-forest riddle, was well and truly answered this week, and the answer shows just how far satellite remote sensing has come in recent years.

Last week, sometime between Monday 10th July and Wednesday 12th July 2017, a huge iceberg was created by splitting off the Larsen C Ice Shelf in Antarctica. It is one of the biggest icebergs ever recorded according to scientists from Project MIDAS, a UK-based Antarctic research project, who estimate its area to be 5,800 sq km and its weight to be more than a trillion tonnes. Its loss has reduced the area of the Larsen C Ice Shelf by more than twelve percent.

The iceberg has been named A68, which is a pretty boring name for such a huge iceberg. However, icebergs are named by the US National Ice Center, and the letter comes from the quadrant where the iceberg was first sighted – in this case the A represents the area from zero to ninety degrees west, covering the Bellingshausen and Weddell Seas. The number is simply the order in which they are discovered, which I assume means there have been 67 previous icebergs in that quadrant!
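
For anyone who likes to see the rule written down, here's a minimal Python sketch of that naming convention. Only quadrant A is described above, so the other quadrant boundaries in the comments are an assumption on my part rather than the official definition.

```python
def quadrant_letter(lon_deg):
    """Quadrant letter used in iceberg names (e.g. the 'A' in A68).

    Only quadrant A is described in the post: 0 to 90 degrees west,
    covering the Bellingshausen and Weddell Seas. The remaining letters
    (B, C, D) are assumed here to cover the other 90-degree quadrants
    moving westwards; treat this as an illustrative sketch, not the
    official rule. Longitude is in degrees east (west is negative).
    """
    lon = lon_deg % 360                 # normalise to 0..360 degrees east
    if 270 <= lon < 360 or lon == 0:    # 0 to 90 degrees west
        return "A"
    if 180 <= lon < 270:                # 90W to 180W
        return "B"
    if 90 <= lon < 180:                 # 180 to 90E
        return "C"
    return "D"                          # 90E to 0

# A68 calved from the Larsen C Ice Shelf at roughly 60 degrees west:
print(quadrant_letter(-60.0))  # -> 'A'; the 68 means the 68th tracked there
```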

After satisfying my curiosity about iceberg names, the other element that caught our interest was the host of Earth observation satellites that captured images of either the calving itself or the newly born iceberg. The ones we’ve spotted so far, although there may be others, are:

  • ESA’s Sentinel-1 has been monitoring the area for the last year as an iceberg splitting from Larsen C was expected. Sentinel-1’s SAR imagery has been crucial to this monitoring as the winter clouds and polar darkness would have made optical imagery difficult to regularly collect.
  • Whilst Sentinel-1 was monitoring the area, it was actually NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) instrument onboard the Aqua satellite that confirmed the ‘birth’ on the 12th July, with a false colour image at 1 km spatial resolution using band 31, which measures thermal infrared. This image is at the top of the blog; dark blue shows where the surface is warmest and lighter blue indicates a cooler surface. The new iceberg can be seen in the centre of the image.
  • Longwave infrared imagery was also captured by the NOAA/NASA Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite on July 13th.
  • NASA also reported that Landsat 8 captured a false-colour image from its Thermal Infrared Sensor on the 12th July showing the relative warmth or coolness of the Larsen C Ice Shelf – with the area around the new iceberg being the warmest, giving an indication of the energy involved in its creation.
  • Sentinel-3A has also got in on the thermal infrared measurements, using the bands of its Sea and Land Surface Temperature Radiometer (SLSTR).
  • ESA’s CryoSat-2 has been used to calculate the size of the iceberg, using its Synthetic Aperture Interferometric Radar Altimeter (SIRAL) to measure the height of the iceberg above the water. Using this data, it has been estimated that the iceberg contains around 1,155 cubic km of ice – a rough version of that freeboard-to-volume calculation is sketched after this list.
  • Finally, the only optical imagery we’ve seen so far is from the Deimos-1 satellite, which is owned by Deimos Imaging, an UrtheCast company. This is from the 14th July and revealed that the giant iceberg was already breaking up into smaller pieces.
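
As promised above, here is a rough, back-of-the-envelope sketch of how a freeboard (height above the waterline) measurement becomes a volume estimate via hydrostatic equilibrium. The densities and mean freeboard below are illustrative assumptions, not the values the CryoSat-2 team used.

```python
# Back-of-the-envelope iceberg volume from an altimeter freeboard measurement.
# All input values are illustrative assumptions only.
RHO_ICE = 917.0       # kg/m^3, typical density of glacial ice
RHO_SEA = 1027.0      # kg/m^3, typical density of sea water

area_km2 = 5800.0         # reported area of A68
mean_freeboard_m = 30.0   # assumed mean height above the waterline

# Hydrostatic equilibrium: freeboard / thickness = 1 - rho_ice / rho_sea
thickness_m = mean_freeboard_m / (1.0 - RHO_ICE / RHO_SEA)
volume_km3 = area_km2 * (thickness_m / 1000.0)

print(f"Estimated thickness: {thickness_m:.0f} m")    # ~280 m
print(f"Estimated volume:    {volume_km3:.0f} km^3")  # ~1,600 km^3, the same
# order of magnitude as the ~1,155 km^3 reported from the CryoSat-2 data.
```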

It’s clear this is a huge iceberg, so huge in fact that most news agencies don’t think readers can comprehend its vastness, and so they offer a comparison to help. Some of the ones I came across were:

  • The size of the US state of Delaware
  • Twice the size of Luxembourg
  • Four times the size of Greater London
  • A quarter of the size of Wales – UK people will know that Wales is almost an unofficial unit of size measurement in this country!
  • The volume of Lake Michigan
  • Twice the volume of Lake Erie
  • The volume of 463 million Olympic-sized swimming pools; and
  • My favourite compares its size to the A68 road in the UK, which runs from Darlington to Edinburgh.

This event shows how satellites are monitoring the planet, and the different ways we can see the world changing.

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS image from 25 April 2017 of the Yucatán Peninsula, showing where thermal bands have picked up increased temperatures. Data courtesy of NASA; NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways satellites can monitor fires.

Acquired on the 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico and northern Central America. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument’s thermal band detected temperatures higher than normal.

False colour image of the West Mims fire on Florida/Georgia boundary acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border acquired by NASA’s Aqua satellite on the 02 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). In the natural colour image the fires could only be seen as smoke plumes, but on the left is the false colour image, which combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself is shown as orange.

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems (GIS) and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; it appears that the MODIS fire products provide the satellite data element. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for undertaking this practice.
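
For the curious, the matching step in that kind of workflow can be done with a simple spatial join; the sketch below uses the open-source geopandas package with made-up file and column names, and is not the Punjab Remote Sensing Centre’s actual system.

```python
import pandas as pd
import geopandas as gpd

# Hypothetical inputs: a CSV of MODIS-style active fire detections (with
# latitude/longitude columns) and a shapefile of field boundaries with a
# 'field_id' attribute.
fires_df = pd.read_csv("active_fires_april.csv")
fires = gpd.GeoDataFrame(
    fires_df,
    geometry=gpd.points_from_xy(fires_df["longitude"], fires_df["latitude"]),
    crs="EPSG:4326",
)
fields = gpd.read_file("field_boundaries.shp").to_crs("EPSG:4326")

# Spatial join: which field polygon does each fire detection fall inside?
# (geopandas >= 0.10 uses predicate=; older versions use op=)
burning_fields = gpd.sjoin(fires, fields, predicate="within")

# Count each field once, however many detections it contains
print(burning_fields["field_id"].nunique(), "fields with detected stubble burning")
```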

Burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo, imaged by Sentinel-2.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest that was burnt last year in the Republic of Congo – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with the Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Combined, Sentinel-1 and Sentinel-2 data can detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, and the temporal resolution will improve to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36,000 hectares of forest were burnt in 2016.
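
The arithmetic behind a burnt-area figure like that is straightforward once you have a per-pixel burnt/not-burnt classification. A minimal sketch, assuming a 10 m pixel grid:

```python
import numpy as np

# Stand-in for a per-pixel burnt/not-burnt classification of a 10 m
# resolution Sentinel-2 scene (random values here, purely for illustration).
rng = np.random.default_rng(0)
burnt_mask = rng.random((1000, 1000)) > 0.99

pixel_area_m2 = 10.0 * 10.0                       # 10 m x 10 m pixels
burnt_hectares = burnt_mask.sum() * pixel_area_m2 / 10_000.0
print(f"Burnt area: {burnt_hectares:,.0f} ha")
```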

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses, and so potentially help save lives. This is another example of the societal benefit of satellite remote sensing.

Remote Sensing Goes Cold

Average thickness of Arctic sea ice in spring as measured by CryoSat between 2010 and 2015. Image courtesy of ESA/CPOM

Remote sensing over the Polar Regions has poked its head above the ice recently.

On the 8th February The Cryosphere, a journal of the European Geosciences Union, published a paper by Smith et al. titled ’Connected subglacial lake drainage beneath Thwaites Glacier, West Antarctica’. It described how researchers used data from ESA’s CryoSat-2 satellite to look at lakes beneath a glacier.

This work is interesting from a remote sensing viewpoint as it is a repurposing of CryoSat-2’s mission. Its main purpose is to measure the thickness of the ice sheets and marine ice cover using its Synthetic Aperture Radar (SAR)/Interferometric Radar Altimeter, known as SIRAL, and it can detect millimetre changes in the elevation of both ice sheets and sea ice.

The team were able to use this data to determine that the ice of the glacier had subsided by several metres as water drained away from four lakes underneath. Whilst the whole process took place between June 2012 and January 2014, the majority of the drainage happened in a six-month period. During this time it’s estimated that peak drainage was around 240 cubic metres per second, which is four times faster than the outflow of the River Thames into the North Sea.

We’ve previously highlighted that repurposing data – using data for more purposes than originally intended – is going to be one of the key future innovation trends for Earth Observation.

Last week, ESA also described how Sentinel-1 and Sentinel-2 data have been used over the last five months to monitor a crack in the ice near the Halley VI research station of the British Antarctic Survey (BAS). The crack, known as the Halloween Crack, is located on the Brunt Ice Shelf in the Weddell Sea sector of Antarctica and was identified last October. The crack grew by around 600 m per day during November and December, although it has since slowed to only one third of that daily growth.

Since last November Sentinel-2 has been acquiring optical images at each overflight, and this has been combined with SAR data from the two Sentinel-1 satellites. This SAR data will be critical during the Antarctic winter when there are only a few hours of daylight and a couple of weeks around mid-June when the sun does not rise.

This work hit the headlines as BAS decided to evacuate their base for the winter due to the potential threat. The Halley VI base, which was only 17 km from the crack, is the first Antarctic research station specifically designed to allow relocation to cope with this sort of movement in the ice shelf. A move of the base 23 km further inland was already planned, and this was successfully completed on the 2nd February. Further movement will depend on how the Halloween Crack develops over the winter.

Finally, the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) project was announced this week at the annual meeting of the American Association for the Advancement of Science. Professor Markus Rex outlined the project, which will sail a research vessel into the Arctic sea ice and let it get stuck so that it can drift across the North Pole. The vessel will be fitted with a variety of remote sensing and in-situ instruments, and will aim to collect data on how the climate is changing in this part of the world by measuring the atmosphere-ice-ocean system.

These projects show that the Polar Regions have a lot of interest, and variety, for remote sensing.

Differences Between Optical & Radar Satellite Data

Angkor Wat, Cambodia. Sentinel-2A image courtesy of ESA.

The two main types of satellite data used in remote sensing are optical and radar. We’re going to take a closer look at each type using the Angkor Wat site in Cambodia, which was the location of the competition we ran on last week’s blog as part of World Space Week. We had lots of entries, and thanks to everyone who took part!

Constructed in the 12th century, Angkor Wat is a temple complex and the largest religious monument in the world. It lies 5.5 kilometres north of the modern town of Siem Reap and is popular with the remote sensing community due to its distinctive features. The site is surrounded by a 190 m-wide moat, forming a 1.5 km by 1.3 km border around the temples and forested areas.

Optical Image
The picture at the top, which was used for the competition, is an optical image taken by the MultiSpectral Instrument (MSI) carried aboard ESA’s Sentinel-2A satellite. Optical data includes the visible wavebands and can therefore produce images, like this one, which are similar to how the human eye sees the world.

The green square in the centre of the image is the moat surrounding the temple complex; on the east side is the Ta Kou Entrance, and on the west side is the sandstone causeway which leads to the Angkor Wat gateway. The temples can be clearly seen in the centre of the moat, together with some of the paths through the forest within the complex.

To the south-east are the outskirts of Siem Reap, and the square moat of Angkor Thom can be seen just above the site. To the right are large forested areas and to the left are a variety of fields.
In addition to the three visible bands at 10 m resolution, Sentinel-2A also has:

  • A near-infrared band at 10 m resolution,
  • Six red-edge and shortwave-infrared bands at 20 m resolution, and
  • Three atmospheric correction bands at 60 m resolution.

Radar Image
As a comparison we’ve produced this image from the twin Sentinel-1 satellites using the C-band Synthetic Aperture Radar (SAR) instrument they carry. This has a spatial resolution of 20 m, and so we’ve not zoomed in as much as with the optical data; in addition, radar data contains speckle noise, which can be distracting.

Angkor Wat, Cambodia. SAR image from Sentinel-1 courtesy of ESA.

The biggest advantage of radar data over optical data is that it is not affected by weather conditions and can see through clouds, and to some degree vegetation. This coloured Sentinel-1 SAR image is produced by showing the two polarisations (VV and VH, i.e. a vertically polarised transmitted signal received in either the vertical or horizontal polarisation) alongside their ratio as the red, green and blue channels.
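
If you’re wondering how such a composite is put together, here’s a minimal sketch in Python. The channel assignment and contrast stretch are my assumptions for illustration, not ESA’s processing chain.

```python
import numpy as np

def vv_vh_composite(vv, vh, eps=1e-6):
    """Build an RGB composite from calibrated VV and VH backscatter.

    R = VV, G = VH, B = VV/VH ratio, each contrast-stretched to 0..1.
    The exact stretch used for the published image is not stated, so
    treat this as an illustrative recipe only.
    """
    def stretch(band):
        lo, hi = np.nanpercentile(band, (2, 98))   # simple 2-98% stretch
        return np.clip((band - lo) / (hi - lo + eps), 0, 1)

    ratio = vv / (vh + eps)
    return np.dstack([stretch(vv), stretch(vh), stretch(ratio)])

# Example with random stand-in data; real inputs would be calibrated
# backscatter bands from a Sentinel-1 GRD product.
vv = np.random.gamma(2.0, 0.05, (512, 512))
vh = np.random.gamma(2.0, 0.02, (512, 512))
rgb = vv_vh_composite(vv, vh)
print(rgb.shape, rgb.min(), rgb.max())
```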

Angkor Wat is shown just below centre, with its wide moat and other archaeological structures surrounding it to the west, north and east. The variety of different landscape features around Angkor Wat shows up more clearly in this image. The light pink to the south is the Cambodian city of Siem Reap, with roads appearing as lines and an airport visible below the West Baray reservoir, which also dates from the Khmer civilisation. The flatter ground, including fields, appears purple, and land with significant tree cover is shown as pale green.

Conclusion
The different types of satellite data have different uses, and different drawbacks. Optical imagery is great if you want to see the world as the human eye does, but radar imagery offers better options when the site is frequently cloudy or where you want an emphasis on the roughness of surfaces.

Flooding Forecasting & Mapping

Sentinel-1 data for York overlaid in red with Pixalytics flood mapping layer based on Giustarini approach for the December 2015 flooding event. Data courtesy of ESA.

Media headlines this week have shouted that the UK is in for a sizzling summer with temperatures in the nineties, coupled with potential flooding in August due to the La Niña weather process.

The headlines were based on the UK Met Office’s three-month outlook for contingency planners. Unfortunately, when we looked at the information ourselves it didn’t exactly say what the media headlines claimed! The hot temperatures were just one of a number of potential scenarios for the summer. As any meteorologist will tell you, forecasting a few days ahead is difficult; forecasting three months ahead is highly complex!

Certainly, La Niña is likely to have an influence. As we’ve previously written, this year has been influenced by a significant El Niño, in which ocean temperatures in the Equatorial Pacific are warmer than usual. La Niña is the opposite phase, with colder ocean temperatures in that region. For the UK this means there is a greater chance of summer storms, which would mean more rain and potential flooding. However, there are a lot of ifs!

At the moment our ears prick up with any mention of flooding, as Pixalytics has just completed a proof of concept project, in association with the Environment Agency, looking to improve operational flood water extent mapping information during flooding incidents.

The core of the project was to implement recent scientific research published by Matgen et al. (2011), Giustarini et al. (2013) and Greifeneder et al. (2014). So it was quite exciting to find out that Laura Giustarini was giving a presentation on flooding during the final day of last week’s ESA Living Planet Symposium in Prague – I wrote about the start of the Symposium in our previous blog.

Laura’s presentation, An Automatic SAR-Based Flood Mapping Algorithm Combining Hierarchical Tiling and Change Detection, was interesting because when we started to implement the research on Sentinel-1 data, we also came to the conclusion that the data needed to be split into tiles. It was great to hear Laura present, and I managed to pick her brains a little at the end of the session. At the top of the blog is a Sentinel-1 image of York, overlaid with a Pixalytics-derived flood map in red for the December 2015 flooding, based on the research published by Laura and colleagues.
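
The published algorithms combine hierarchical tiling, automatic thresholding and change detection; the snippet below is only a heavily simplified illustration of the core idea (open water appears dark and smooth in SAR backscatter, so it falls below a per-tile threshold). It is not a reimplementation of the Giustarini et al. method, nor of our product.

```python
import numpy as np

def simple_flood_mask(sigma0_db, tile=256, offset_db=-3.0):
    """Crude per-tile water mask from SAR backscatter in dB.

    Open water is usually much darker than land in Sentinel-1 imagery, so
    each tile is thresholded a fixed offset below its median. The real
    algorithms fit the water and land distributions and add change
    detection against a reference image; this is only an illustration.
    """
    mask = np.zeros_like(sigma0_db, dtype=bool)
    rows, cols = sigma0_db.shape
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            block = sigma0_db[r:r + tile, c:c + tile]
            threshold = np.nanmedian(block) + offset_db
            mask[r:r + tile, c:c + tile] = block < threshold
    return mask

# Stand-in scene: land around -10 dB with a dark "flooded" square at -20 dB
scene = np.random.normal(-10, 1.5, (512, 512))
scene[200:300, 200:300] = np.random.normal(-20, 1.5, (100, 100))
print(simple_flood_mask(scene).sum(), "pixels flagged as water")
```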

The whole session on flooding, which took place on the last morning of the Symposium, was interesting. The presentations also included:

  • the use of COSMO-SkyMed data for mapping floods in forested areas within Finland.
  • extending flood mapping to consider Sentinel-1 InSAR coherence and polarimetric information.
  • an intercomparison of the processing systems developed at DLR.
  • development of operational flood mapping in Norway.

It was useful to understand where others were making progress with Sentinel-1 data, and how different processing systems were operating. It was also interesting that several presenters showed findings, or made comments, related to the double bounce that occurs when a radar signal is reflected off not just the ground but another structure, such as a building or tree. Again, it is something we needed to consider, as we were particularly looking at urban areas.

The case study of our flood mapping project was published last week on the Space for Smarter Government Programme website, as they funded the project via the UK Space Agency, using the Small Business Research Initiative supported by Innovate UK.

We are continuing with our research, with the aim of having our own flood mapping product later this year – although the news that August may have flooding means we might have to quicken our development pace!

Sentinel’s Milestone and Millstone

Sentinel-1A multi-temporal colour composite of land coverage across Ireland. Contains modified Copernicus Sentinel data [2015], processed by ESA. Data courtesy of ESA.

A significant milestone was achieved for the European Commission’s Copernicus Programme with the launch of the Sentinel-1B satellite. It is the fourth Sentinel satellite to be launched and, as the second Sentinel-1 satellite, it completes the first of the planned two-satellite constellations.

It was launched on 25th April from French Guiana. In addition to Sentinel-1B, three student CubeSats were onboard the Soyuz rocket. Students from the University of Liege, the Polytechnic of Turin, Italy, and the University of Aalborg developed the 10 cm CubeSats, which will be deployed into orbit as part of ESA’s ‘Fly Your Satellite!’ programme.

Sentinel-1B is an identical twin to Sentinel-1A, which was launched on the 3rd April 2014, and they will operate as a two-satellite constellation orbiting 180 degrees apart at an altitude of approximately 700 km. They both carry a C-band Synthetic Aperture Radar (SAR) instrument and together will cover the entire planet every six days, although the Arctic will be revisited every day, and Europe, Canada and the main shipping routes every three days.

Sentinel-1 data has a variety of applications, including monitoring sea ice, maritime surveillance, humanitarian aid in disasters, and mapping for forest, water and soil management. The benefits were demonstrated this week with:

  • The release of a video showing the drop in rice-growing productivity in the Mekong River Delta over the last year; and
  • The multi-temporal colour composite of land coverage of Ireland as shown at the top of this post. It was created from 16 radar scans over 12 days during May 2015, where:
    • The blues represent changes in water or agricultural activities such as ploughing; the yellows represent urban centres; vegetated fields and forests appear in green; and the reds and oranges represent unchanging features such as bare soil.

With this constellation up and working, the revisit speed has the chance to be the game changer in the uptake of space-generated data.

Sadly, there’s a millstone hanging around the Copernicus Programme’s neck hindering this change – accessing the data remains difficult for commercial organisations.

Currently, selecting and downloading Sentinel data is a painful process, one that mostly either does not work or is so slow that you give up on it! This is a result of the size of the datasets and the popularity of data that is free to access for everyone worldwide.

There are a number of ways of getting access to this data, with varying success in our experience, including:

  • EU’s Copernicus Hub – Operational, but slow to use. Once you have selected the data to download, either manually or via a script (a scripted example is sketched after this list), the process is extremely slow and often times out before the download completes.
  • USGS – Offers Sentinel-2, but not Sentinel-1, data via its EarthExplorer and GloVis interfaces. The download process is easier, but the format of Sentinel-2 makes searching a bit strange in GloVis, and it’s only a partial representation of the available acquisitions.
  • The UK Collaborative Ground Segment Access, despite signing an agreement with ESA in March 2015, has not yet been made available for commercial entities.
  • It is possible to apply for access to the academically focused STFC Centre for Environmental Data Analysis (CEDA) system, which provides FTP access and has good download speeds for the data that’s available.
  • Amazon’s archive of Sentinel-2 data, which has good download speeds but is cumbersome to search without developing software, i.e. scripts.

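For reference, our scripted downloads from the Copernicus Hub look something like the sketch below, which uses the open-source sentinelsat package (not mentioned above); the credentials, area of interest and query parameters are placeholders.

```python
# Sketch of a scripted Copernicus Hub download using the open-source
# sentinelsat package; the credentials and the area-of-interest file are
# placeholders, and the hub is often slow or times out, as noted above.
from sentinelsat import SentinelAPI, read_geojson, geojson_to_wkt

api = SentinelAPI("username", "password", "https://scihub.copernicus.eu/dhus")

footprint = geojson_to_wkt(read_geojson("area_of_interest.geojson"))  # hypothetical AOI
products = api.query(
    footprint,
    date=("20160501", "20160531"),
    platformname="Sentinel-2",
    cloudcoverpercentage=(0, 30),
)

print(len(products), "scenes found")
api.download_all(products)   # expect retries and time-outs in practice
```
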
There are also further services and routes being developed to facilitate searching and downloading from the various archives; for example, there’s the QGIS ‘Semi-Automatic Classification’ plugin and the EOProc SatCat service for Sentinel-2. With Sentinel-3A data coming online soon, the situation will get more complex for those of us trying to use data from all the Sentinel missions.

Getting the satellites into space is great, but that is only the first step. Until the data is put into the hands of people who can use it to create value and inspire others, the Sentinel missions will not fulfil their full potential in widening the use of space-generated data.

How to Measure Heights From Space?

Combining two Sentinel-1A radar scans from 17 and 29 April 2015, this interferogram shows changes on the ground that occurred during the 25 April earthquake that struck Nepal. Contains Copernicus data (2015)/ESA/Norut/PPO.labs/COMET–ESA SEOM INSARAP study

Accurately measuring the height of buildings, mountains or water bodies is possible from space. Active satellite sensors send out pulses of energy towards the Earth and measure the strength and timing of the energy received back, enabling them to determine the heights of the objects struck by the pulse.

This measurement of the time it takes an energy pulse to return to the sensor can be used with both optical and microwave data. Optical techniques such as lidar send out a laser pulse; however, within this blog we’re going to focus on techniques using microwave energy, which operate within the Ku, C, S and Ka frequency bands.

Altimetry is a traditional technique for measuring heights. This type of technique is termed Low Resolution Mode, as it sends out a pulse of energy that returns from a wide footprint on the Earth’s surface. Therefore, care needs to be taken with variable surfaces, as the energy reflected back to the sensor mixes measurements from different surfaces. The signal also needs to be corrected for its speed of travel through the atmosphere and for small changes in the orbit of the satellite before it can be used to calculate a height to centimetre accuracy. Satellites that use this type of methodology include Jason-2, which operates in the Ku band, and SARAL/AltiKa, which operates in the Ka band. Pixalytics has been working on a technique to measure river and flood water heights using this type of satellite data. This would have a wide range of applications in remote area monitoring, early warning systems and disaster relief, and, as shown in the paper ‘Challenges for GIS remain around the uncertainty and availability of data’ by Tina Thomson, offers potential for the insurance and risk industries.
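
At its simplest, the altimetry height calculation is just a timing measurement plus corrections; the sketch below shows the basic arithmetic, with the orbit and correction values as illustrative placeholders only.

```python
# Basic radar altimetry arithmetic: surface height from pulse travel time.
# All numbers below are illustrative placeholders, not real measurements.
C = 299_792_458.0                      # speed of light, m/s

satellite_altitude_m = 1_336_000.0     # a Jason-2-like orbit (~1,336 km)
two_way_travel_time_s = 0.00891234     # measured echo delay (made up)

range_m = C * two_way_travel_time_s / 2.0     # distance from sensor to surface
atmospheric_corrections_m = 2.4               # wet/dry troposphere, ionosphere (placeholder)
orbit_correction_m = 0.0                      # precise orbit adjustment (placeholder)

surface_height_m = (satellite_altitude_m
                    - (range_m + atmospheric_corrections_m)
                    + orbit_correction_m)
print(f"Surface height above the reference: {surface_height_m:.2f} m")
```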

A second methodology for measuring heights using microwave data is Interferometric Synthetic Aperture Radar (InSAR), which uses phase measurements from two or more successive satellite SAR images to determine the Earth’s shape and topography. It can calculate millimetre-scale changes in height and can be used to monitor natural hazards and subsidence. InSAR works best with relatively static surfaces, such as buildings, where successive satellite images can be accurately compared. However, where you have dynamic surfaces, such as water, the technique is much more difficult to use as the surface will have naturally changed between images. Both ESA’s Sentinel-1 and CryoSat-2 carry instruments to which this technique can be applied.
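
The phase-to-displacement conversion at the heart of InSAR is compact: one full fringe (2π of phase) corresponds to half the radar wavelength of movement along the line of sight. A minimal sketch for a C-band sensor such as Sentinel-1:

```python
import numpy as np

WAVELENGTH_M = 0.0555   # Sentinel-1 C-band wavelength, roughly 5.55 cm

def los_displacement(delta_phase_rad):
    """Line-of-sight displacement from an (unwrapped) interferometric phase change.

    One fringe (2*pi) corresponds to wavelength/2 of movement towards or
    away from the sensor, i.e. about 2.8 cm for C-band.
    """
    return (WAVELENGTH_M / (4.0 * np.pi)) * delta_phase_rad

print(f"{los_displacement(2 * np.pi) * 100:.1f} cm of movement per fringe")
```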

The image at the top of the blog is an interferogram using data collected by Sentinel-1 in the aftermath of the recent earthquake in Nepal. The colours on the image reflect the movement of the ground between the before and after images, and initial investigations by scientists indicate that Mount Everest has shrunk by 2.8 cm (about an inch) following the quake, although this needs further research to confirm the height change.

From the largest mountain to the smallest changes, satellite data can help measure heights across the world.

Temporal: The forgotten resolution

Time, Copyright: scanrail / 123RF Stock Photo

Temporal resolution shouldn’t be forgotten when considering satellite imagery; however, it’s often neglected, with its partners of spatial and spectral resolution getting the limelight. The reason is the special relationship between spatial and spectral resolution, where a higher spectral resolution has meant a lower spatial resolution and vice versa, because of limited onboard storage and transmission capabilities. Therefore, when considering imagery most people focus on their spatial or spectral requirements and go with whatever best suits them, rarely giving temporal resolution a second thought, other than whether immediate data acquisition is required.

Temporal resolution is the amount of time it takes a satellite to return to collect data for exactly the same location on Earth, also known as the revisit or repeat time, expressed in hours or days. Global coverage satellites tend to have low Earth polar, or near-polar, orbits, travelling at around 27,000 kph and taking around 100 minutes to circle the Earth. With each orbit the Earth rotates about twenty-five degrees around its polar axis, so on each successive orbit the ground track moves to the west, meaning it takes a couple of weeks to build up full coverage; for example, Landsat has a 16-day absolute revisit time.
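
The westward drift of the ground track is easy to estimate from the orbital period; a quick sketch, treating the numbers above as approximate:

```python
# Rough ground-track drift per orbit for a low Earth, near-polar orbit.
EARTH_ROTATION_DEG_PER_HOUR = 360.0 / 24.0   # ~15 degrees per hour
orbit_period_min = 100.0                     # ~100 minutes per orbit (approximate)

shift_deg = EARTH_ROTATION_DEG_PER_HOUR * orbit_period_min / 60.0
km_per_degree_at_equator = 40_075.0 / 360.0  # equatorial circumference / 360

print(f"Ground track shifts ~{shift_deg:.0f} degrees "
      f"(~{shift_deg * km_per_degree_at_equator:.0f} km at the equator) westwards per orbit")
```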

Only seeing the part of the Earth you want to image once every few weeks isn’t very helpful if you want to see daily changes. Therefore, there are a number of techniques satellites use to improve the temporal resolution:

  • Swath Width – A swath is the strip of ground the satellite sees with each orbit; the wider the swath, the greater the ground coverage, but generally a wider swath means lower spatial resolution. A satellite with a wide swath will have significant overlaps between orbits, which allows areas of the Earth to be imaged more frequently, reducing the revisit time. MODIS uses a wide swath and images the globe every one to two days.
  • Constellations – If you have two identical satellites orbiting one hundred and eighty degrees apart you will reduce revisit times, and this approach is being used by ESA’s Sentinel missions. Sentinel-1A was launched in 2014, with its twin Sentinel-1B due to be launched in 2016. When operating together they will provide a temporal resolution of six days. Obviously, adding more satellites to a constellation will continue to reduce the revisit time.
  • Pointing – High-resolution satellites in particular use this method, which allows them to point their sensors at a particular spot on Earth and so map the same area from multiple orbits. However, pointing changes the angle at which the sensor views the Earth, which means the imaged ground area can be distorted.
  • Geostationary Orbits – Although technically not the same thing, a geostationary satellite remains focussed on the same area of the Earth at all times, so the temporal resolution is simply how often imagery is taken, for example every fifteen minutes. The problem is that you can only map a restricted area.

Hopefully, this has given you a little insight into temporal resolution; whilst spectral and spatial resolution are important factors when considering what imagery you need, do spend a bit of time considering your temporal needs too!

The Reality of Gravity

Space debris has hit the entertainment headlines recently, with the film Gravity winning various awards; it deals with the aftermath of a space debris collision with the International Space Station. So is the film just a fictional thriller or a prophetic warning?

Image courtesy of ESA. Note: the debris field shown in the image is an artist's impression based on actual data; however, the debris objects are shown at an exaggerated size to make them visible at the scale shown.

Our recent blog noted there are currently almost 4,000 satellites in orbit, of which almost 1,200 are operational, leaving around 2,800 pieces of junk metal up there. However, space debris, sometimes called orbital debris or space junk, is far wider than simply old satellites; it encompasses every man-made object circling the Earth. It includes upper rocket stages, the results of anti-satellite weapon testing, solid rocket motor waste and anything that has broken off any satellite or spacecraft. It also includes less obvious items that have been ‘lost’ in space, including a camera, a tool bag, a single glove, a pair of pliers and a spatula – although some of these did burn up! There is also the less tasteful astronaut urine, which for many years was dumped into space.

In total, the U.S. Space Surveillance Network estimates that there are more than 21,000 pieces of space debris larger than 10 cm, 500,000 pieces between 1 cm and 10 cm in size and 100 million pieces smaller than 1 cm; all of which add up to a mass of around 6,500 tonnes orbiting the planet. Most of the debris is at altitudes of less than 2,000 km, which coincides with the low Earth orbits that account for fifty percent of all active satellites. With the debris travelling at speeds of up to 33,500 miles per hour, any collision is serious.

The danger was demonstrated with the recent deployment of ESA’s Sentinel-1 satellite. Less than 24 hours after launch there was a warning that the low Earth orbit NASA satellite ACRIMSAT, which had run out of fuel and could no longer be manoeuvred, was on a potential collision course. Sentinel-1 required an immediate manoeuvre to alter its orbital altitude – something the launch team hadn’t anticipated! In addition to active satellites, the International Space Station regularly has to alter its orbit to avoid space debris.

As more satellites are launched (last year saw the greatest number in history), the potential for space debris increases. Scientists warn of the Kessler Syndrome, a scenario where the high density of objects in low Earth orbit causes a cascade of collisions, creating so much debris that it becomes impossible to launch anything successfully into space. In our clamour for more robust positioning and tracking and a wider variety of sensing options, we need to consider the future. End-of-life solutions must be given as much attention as the launch, to ensure our industry continues to develop and grow for future generations of Earth observation scientists.

Cresting Wavelength 2014

Today is the final day of the Remote Sensing and Photogrammetry Society’s (RSPSoc) annual Wavelength Conference for students and early career professionals in remote sensing and photogrammetry. This year, Pixalytics was one of the sponsors of the conference, which was well attended by students from many international and UK universities, as well as representatives from a number of commercial remote sensing service providers and consultancies.

Over the three-day event, keynote speakers and student poster presentations served to illustrate the seemingly infinite number of possible applications for remote sensing. One really interesting application was presented by Emily Norton, a PhD student at Bournemouth University. She is an experienced forensic anthropologist with the Inforce Foundation, a charity focussed on providing forensic expertise for the scientific detection, recovery and identification of victims of mass fatality incidents, genocide, war crimes and similar crimes against humanity. Emily has previously worked in Rwanda investigating reports of mass graves following the 1994 genocide. Usually forensic work is intelligence-led, but local reports are often imprecise and spatial data is needed to pinpoint graves. Once graves are located, forensic investigation is used to support war crime tribunals and, most importantly, return remains to families for proper burial.

Following the outbreak of foot-and-mouth disease in the UK in 2001, thousands of livestock animals were destroyed and buried at sites across the country. Emily has used Landsat imagery of these animal graves as a basis to study the changes in vegetation at each site; her research means these principles could be used to detect clandestine mass graves in areas of conflict. Emily won the best poster competition at this year’s conference, and will travel to Bosnia later this year to test the remote sensing method further and begin to develop a streamlined, standardised approach which can be used in developing countries to support future humanitarian efforts. With global coverage, a historical archive and the ability to be used safely in remote or high-risk areas, remote sensing could be a valuable tool in this area of work.
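
As a simplified illustration of the kind of vegetation change detection involved, here’s a sketch that differences NDVI between two co-registered scenes; the stand-in values and the change threshold are assumptions for illustration, not Emily’s method.

```python
import numpy as np

def ndvi(red, nir, eps=1e-6):
    """Normalised Difference Vegetation Index from red and near-infrared reflectance."""
    return (nir - red) / (nir + red + eps)

# Stand-in arrays; in practice these would be co-registered Landsat red and
# near-infrared bands from before and after the burial date.
red_before, nir_before = np.full((100, 100), 0.08), np.full((100, 100), 0.40)
red_after,  nir_after  = np.full((100, 100), 0.12), np.full((100, 100), 0.30)

change = ndvi(red_after, nir_after) - ndvi(red_before, nir_before)
disturbed = change < -0.1   # illustrative threshold for a drop in vegetation vigour
print(disturbed.sum(), "pixels showing a marked NDVI decrease")
```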

One of the consistent themes of this year’s conference is that advances in technology mean that remote sensing equipment is becoming smaller, lighter, cheaper and more accurate, enabling a wider variety of remote sensing data to be collected. One of the most interesting features of the earth observation community is that each advance in technology drives new areas of research which, in turn, uncover new uses for remote sensing data, which then demands new technology! Hopefully, ESA’s Sentinel satellites will continue this cycle and inspire a new generation of remote sensing scientists; here’s to Wavelength 2015!

Blog by Bryony Hanlon, work placement student at Pixalytics Ltd and an attendee at Wavelength 2014.