If no-one is there when an iceberg is born, does anyone see it?

Larsen C Ice Shelf, including the A68 iceberg. Image acquired by the MODIS instrument on the Aqua satellite on 12th July 2017. Image courtesy of NASA.

The titular paraphrasing of the famous tree-falling-in-the-forest riddle was well and truly answered this week, in a way that shows just how far satellite remote sensing has come in recent years.

Last week, sometime between Monday 10th July and Wednesday 12th July 2017, a huge iceberg calved from the Larsen C Ice Shelf in Antarctica. It is one of the biggest icebergs ever recorded according to scientists from Project MIDAS, a UK-based Antarctic research project, who estimate its area to be 5,800 sq km and its weight to be more than a trillion tonnes. Its loss has reduced the area of the Larsen C Ice Shelf by more than twelve percent.

The iceberg has been named A68, which is a pretty boring name for such a huge iceberg. However, icebergs are named by the US National Ice Center, and the letter indicates where the iceberg was first sighted – in this case the A represents the quadrant from zero to ninety degrees west, covering the Bellingshausen and Weddell Seas. The number is simply the order in which the icebergs are discovered, which I assume means there have been 67 previous ones!
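That naming convention can be sketched as a tiny longitude lookup. The quadrant boundaries below follow the US National Ice Center scheme as described above; the function itself is just an illustration, not an official tool:

```python
def iceberg_quadrant(lon_deg):
    """Return the US National Ice Center quadrant letter for the
    longitude where an iceberg was first sighted.

    Longitude convention: negative = degrees west, positive = degrees east.
    """
    if -90 <= lon_deg < 0:
        return "A"  # 0-90W: Bellingshausen / Weddell Seas
    if -180 <= lon_deg < -90:
        return "B"  # 90W-180: Amundsen / Eastern Ross Seas
    if 90 <= lon_deg <= 180:
        return "C"  # 180-90E: Western Ross Sea / Wilkes Land
    if 0 <= lon_deg < 90:
        return "D"  # 90E-0: Amery / Eastern Weddell Sea
    raise ValueError("longitude must be within [-180, 180]")

# Larsen C sits at roughly 62 degrees west, hence the "A" prefix;
# 68 means it is the 68th "A" iceberg tracked since records began.
print(iceberg_quadrant(-62))  # A
```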

After satisfying my curiosity about iceberg names, the other element that caught our interest was the host of Earth observation satellites that captured images of either the calving event or the newly birthed iceberg. The ones we’ve spotted so far, although there may be others, are:

  • ESA’s Sentinel-1 has been monitoring the area for the last year, as an iceberg splitting from Larsen C was expected. Sentinel-1’s SAR imagery has been crucial to this monitoring, as winter clouds and polar darkness would have made optical imagery difficult to collect regularly.
  • Whilst Sentinel-1 was monitoring the area, it was actually NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) instrument onboard the Aqua satellite that confirmed the ‘birth’ on the 12th July, with a false colour image at 1 km spatial resolution using band 31, which measures thermal infrared. This image is at the top of the blog; the dark blue shows where the surface is warmest and lighter blue indicates a cooler surface. The new iceberg can be seen in the centre of the image.
  • Longwave infrared imagery was also captured by the NOAA/NASA Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite on July 13th.
  • Similarly, NASA also reported that Landsat 8 captured a false-colour image from its Thermal Infrared Sensor on the 12th July showing the relative warmth or coolness of the Larsen C Ice Shelf – the area around the new iceberg was the warmest, giving an indication of the energy involved in its creation.
  • Finally, Sentinel-3A has also got in on the thermal infrared act, using the bands of its Sea and Land Surface Temperature Radiometer (SLSTR).
  • ESA’s CryoSat has been used to calculate the size of the iceberg using its Synthetic Aperture Interferometric Radar Altimeter (SIRAL), which measured the height of the iceberg above the water – its freeboard. Using this data, it has been estimated that the iceberg contains around 1,155 cubic km of ice.
  • The only optical imagery we’ve seen so far is from the Deimos-1 satellite, owned by Deimos Imaging, an UrtheCast company. This is from the 14th July and revealed that the giant iceberg was already breaking up into smaller pieces.
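The CryoSat volume estimate in the list above rests on a simple piece of physics: floating ice is in hydrostatic equilibrium, so the freeboard that SIRAL measures fixes the total thickness. Here is a back-of-the-envelope version – the densities and the freeboard figure are illustrative assumptions, not the actual CryoSat processing:

```python
# Typical density values; the real retrieval also accounts for snow cover etc.
RHO_ICE = 917.0        # kg/m^3, glacial ice
RHO_SEAWATER = 1027.0  # kg/m^3

def iceberg_volume_km3(area_km2, freeboard_m):
    """Estimate iceberg volume from its area and freeboard.

    Archimedes: total thickness = freeboard * rho_sea / (rho_sea - rho_ice),
    because only (rho_sea - rho_ice) / rho_sea of the ice sits above water.
    """
    thickness_km = freeboard_m * RHO_SEAWATER / (RHO_SEAWATER - RHO_ICE) / 1000.0
    return area_km2 * thickness_km

# With A68's ~5,800 sq km area, an assumed ~21 m mean freeboard implies a
# thickness near 200 m and a volume close to the reported ~1,155 cubic km.
print(round(iceberg_volume_km3(5800.0, 21.3)))
```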

It’s clear this is a huge iceberg – so huge, in fact, that most news agencies don’t think readers can comprehend its vastness without a comparison. Some of the ones I came across were:

  • The size of the US state of Delaware
  • Twice the size of Luxembourg
  • Four times the size of Greater London
  • A quarter of the size of Wales – UK people will know that Wales is almost an unofficial unit of size measurement in this country!
  • The volume of Lake Michigan
  • Twice the volume of Lake Erie
  • The volume of 463 million Olympic-sized swimming pools; and
  • My favourite, which compares it to the A68 road in the UK, running from Darlington to Edinburgh.

This event shows how satellites are monitoring the planet, and the different ways we can see the world changing.

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS image from 25 April 2017 of the Yucatán Peninsula, showing where thermal bands have picked up increased temperatures. Data courtesy of NASA; NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways satellites can monitor fires.

Acquired on 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument’s thermal band detected temperatures higher than normal.
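The ‘temperatures higher than normal’ test behind those red dots can be illustrated with a simplified thermal-anomaly check. Real active-fire products (such as the MODIS and VIIRS fire products) use per-pixel background windows, multiple bands and cloud masks; this sketch, with made-up threshold values, just shows the core idea:

```python
import numpy as np

def flag_thermal_anomalies(bt_kelvin, abs_threshold=330.0, k=3.0):
    """Flag pixels whose brightness temperature is anomalously high.

    A pixel is flagged if it exceeds an absolute threshold, or if it is
    more than k standard deviations above the scene's background mean.
    This is a toy illustration, not an operational algorithm.
    """
    background = bt_kelvin[bt_kelvin < abs_threshold]
    mean, std = background.mean(), background.std()
    return (bt_kelvin > abs_threshold) | (bt_kelvin > mean + k * std)

scene = np.full((5, 5), 300.0)   # an ambient ~300 K scene
scene[2, 2] = 345.0              # one hot (burning) pixel
print(flag_thermal_anomalies(scene).sum())  # 1
```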

False colour image of the West Mims fire on the Florida/Georgia boundary, acquired by MODIS on 02 May 2017. Data courtesy of NASA; NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border, imaged from NASA’s Aqua satellite on 02 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). In the natural colour image the fires could only be seen as smoke plumes, but the false colour image on the left combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself is shown in orange.

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; it appears that the MODIS fire products contribute the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for undertaking this practice.

Burnt areas, shown in shades of red and purple, in the Marantaceae forests of the northern Republic of Congo, imaged by Sentinel-2.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest burnt last year in the Republic of Congo in Africa – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with the Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m; the revisit time will shorten to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36,000 hectares of forest were burnt in 2016.

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses, potentially helping to save lives. This is another example of the societal benefit of satellite remote sensing.

The Day the Lights Dimmed

According to a paper published by Falchi et al. in June 2016, around 80% of the world’s population suffers from light pollution. The paper, ‘The new world atlas of artificial night sky brightness’, further noted that in Europe and the USA over 99% of people experience skyglow.

Skyglow is one component of light pollution, and refers to the brightening of the night sky over inhabited areas. Its prominence was demonstrated last week in Puerto Rico, when a large fire at the Aguirre Power Plant, in the Salinas area, caused the lights to dim across the island.

The fire began when a power switch overheated, causing an oil tank to explode. The resulting fire spread over a three-acre area, affected power generation and cut off water supplies. Around one and a half million people lost power, equating to over 40% of the island’s population, and 350,000 people were cut off from water.

This power loss gave a spectacular example of the skyglow effect, as it was possible to produce comparable night-time pictures from satellites. Pictures taken twenty-four hours apart were acquired by the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard the Suomi NPP satellite, which we described in detail in a recent blog on the Rio Olympics.

Image of Puerto Rico, acquired on 21st September 2016 from the VIIRS instrument. Data courtesy of NASA/ NASA’s Earth Observatory


On the right is the ‘before’ image, acquired at 2.50am local time on 21st September. On the north coast, just to the right of centre, the bright white concentration shows the light from the capital city, San Juan. This city is the centre for manufacturing, finance and tourism on the island, and the site of its key seaport. Light can also be seen around the edge of the island, which effectively maps the island’s main highways. The power outage affected the whole island, including the westerly cities of Mayagüez and Aguadilla, the southern coastal city of Ponce and Humacao on the east coast.

Image of Puerto Rico, acquired on 22nd September 2016 from the VIIRS instrument. Data courtesy of NASA/ NASA’s Earth Observatory


Compare this with the ‘after’ image to the right, taken approximately twenty-four hours later at 2.31am on 22nd September. Power had already started coming back on by this point, but only 130,000 people were reconnected in the first twelve hours, so there was still a major outage. The concentration around San Juan is reduced significantly, as are the lights mapping the highways. All the areas are still identifiable, but the reduction in skyglow is readily apparent.

Whilst the pictures of cities and islands at night can be amazing, light pollution does have negative impacts on both us and the natural world – particularly nocturnal wildlife.

These images demonstrate the impact of skyglow, and we should all look to try and reduce the amount of light pollution in our own lives, cities and countries.

Night-time Treats

This image of Rio de Janeiro was acquired on the night of July 20, 2012 by the VIIRS instrument aboard the Suomi NPP satellite. Data courtesy of NASA/NASA’s Earth Observatory.


The Opening Ceremony of the Rio Olympics featured a plane taking off from the Maracanã Stadium and treating us to a fantastic night flight over Rio. It was a beautiful sequence celebrating the famous Brazilian aviator Alberto Santos-Dumont; for us at Pixalytics, it led to a conversation about the beauty of night-time satellite imagery!

Currently, the best source of night-time imagery is the Visible Infrared Imaging Radiometer Suite (VIIRS), one of five instruments aboard the Suomi National Polar-orbiting Partnership satellite launched on 28 October 2011. If you look on Twitter, you’ll also see a huge number of night-time images taken by astronauts aboard the International Space Station. This data has been used as the basis of the Cities at Night citizen science project, whose aim is to create a Google Maps-style map of the world at night – as the astronauts use cameras to photograph the places that interest them, and there is no georeferencing information, citizens identify the cities pictured.

In contrast, VIIRS is on an orbiting satellite and so continually collects calibrated and georeferenced data of the whole globe. During the day VIIRS collects optical and temperature data over both the land and ocean, while at night it collects temperature data and night-time imagery using the 750 m spatial resolution Day/Night Band (DNB). Working through both night and day, the DNB needs to be calibrated across several orders of magnitude in brightness to accommodate the dramatic contrast between solar reflection and the darkness of night. Its forerunner was the uncalibrated Operational Linescan System (OLS) on the Defense Meteorological Satellite Program (DMSP) satellites, whose primary aim was to study clouds; when its data was declassified in the 1970s it generated a lot of interest in low-light night-time observations.

The DNB VIIRS images, like the one at the top of the blog, show hubs of human activity and the road arteries that connect them, and so are of special interest to the Campaign to Protect Rural England, who use these types of maps to protect dark skies. They also enable calculations of light pollution to be made, together with indications of the associated carbon emissions. The DNB can pick up many different phenomena: aurorae, gas flares, volcanic activity, the lights of ships and sea ice, and it supports climatological monitoring of clouds. It’s even possible to see thunderstorms; although individual lightning flashes are hard to make out in these snapshots, the glow they cause inside clouds is evident as bright strips in DNB imagery, as seen in an image over Louisiana, USA on 4 April 2012 (Miller et al., 2013).

Another interesting discovery, in 2012, was the presence of a faint ‘nightglow’ in the upper atmosphere on moonless nights over the Pacific. The DNB team were aiming to collect scenes of complete darkness for calibration purposes, but found that clouds were still clearly visible. This was due to an assortment of photochemical reactions, especially involving the hydroxyl radical, and the nightglow reveals subtle atmospheric phenomena such as gravity waves and the tops of anvil clouds.

Here we’ve gone from aviation inspiration dating back to 1903 to modern satellites, all via the Rio Olympics. It’s amazing where space can take you!

 

Blog written by Dr Louisa Reynolds and Andrew Lavender from Pixalytics Ltd.

Ocean Colour Cubes

August 2009 Monthly Chlorophyll-a Composite; data courtesy of the ESA Ocean Colour Climate Change Initiative project


It’s an exciting time to be in ocean colour! A couple of weeks ago we highlighted the new US partnership using ocean colour as an early warning system for harmful freshwater algae blooms, and last week a new ocean colour CubeSat development was announced.

Ocean colour is something very close to our hearts; it was the basis of Sam’s PhD and a field of research she remains highly active in today. When Sam began her PhD, the Coastal Zone Color Scanner (CZCS) was the main source of satellite ocean colour data, until it was superseded by the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), which became the focus of her role at Plymouth Marine Laboratory.

Currently, there are a number of ocean colour instruments in orbit:

  • NASA’s twin MODIS instruments on the Terra and Aqua satellites
  • NOAA’s Visible Infrared Imaging Radiometer Suite (VIIRS)
  • China’s Medium Resolution Spectral Imager (MERSI), Chinese Ocean Colour and Temperature Scanner (COCTS) and Coastal Zone Imager (CZI) onboard several satellites
  • South Korea’s Geostationary Ocean Color Imager (GOCI)
  • India’s Ocean Colour Monitor on-board Oceansat-2

Despite having these instruments in orbit, there is very limited global ocean colour data available for research applications. This is because the Chinese data is not easily accessible outside China, Oceansat-2 data isn’t of sufficient quality for climate research, and GOCI is on a geostationary satellite so its data covers only a limited geographical area focussed on South Korea. With MODIS, the Terra satellite has limited ocean colour applications due to issues with its mirror and hence calibration; and recently the calibration on Aqua has also become unstable due to its age. Therefore, the ocean colour community is left with just VIIRS; and the data from this instrument has only recently been proven.

With limited good quality ocean colour data, there is significant concern over the potential loss of continuity in this valuable dataset. The next planned instrument to provide a global dataset will be OLCI onboard ESA’s Sentinel-3A, due to be launched in November 2015; with everyone having their fingers crossed that MODIS will hang on until then.

Launching a satellite takes time and money, and satellites carrying ocean colour sensors have generally been big – for example, Sentinel-3A weighs 1,250 kg and the MODIS instrument alone 228.7 kg. This is why the project announced last week to build two ocean colour CubeSats is so exciting; they are planned to weigh only 4 kg each, which reduces both the expense and the launch lead time.

The project, called SOCON (Sustained Ocean Observation from Nanosatellites), will see Clyde Space, from Glasgow in the UK, build an initial two prototype SeaHawk CubeSats carrying HawkEye ocean colour sensors, with a ground resolution of between 75 m and 150 m per pixel, to be launched in early 2017. The project consortium includes the University of North Carolina, NASA’s Goddard Space Flight Centre, the Hawk Institute for Space Sciences and Cloudland Instruments. The eventual aim is to have constellations of CubeSats providing a global view of both ocean and inland waters.

There are a number of other ocean colour satellite launches planned in the next ten years, including follow-on missions such as Oceansat-3, two missions from China, GOCI-2 and a second VIIRS mission.

With new missions, new data applications and miniaturised technology, we could be entering a purple patch for ocean colour data – although purple in ocean colour usually represents a Chlorophyll-a concentration of around 0.01 mg/m3 on the standard SeaWiFS colour palette as shown on the image at the top of the page.
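For the curious, that colour mapping is logarithmic: chlorophyll-a concentrations span several orders of magnitude, so palettes like the standard SeaWiFS one place values along a log10 scale. A small sketch of the idea – the 0.01 to 64 mg/m3 range matches the common SeaWiFS palette, but this is an illustration, not an official lookup table:

```python
import math

def palette_position(chl_mg_m3, lo=0.01, hi=64.0):
    """Map a chlorophyll-a concentration onto a 0-1 position along a
    log-scaled colour palette (0 = low/purple end, 1 = high end)."""
    x = (math.log10(chl_mg_m3) - math.log10(lo)) / (math.log10(hi) - math.log10(lo))
    return min(max(x, 0.0), 1.0)  # clamp values outside the palette range

print(palette_position(0.01))  # 0.0 -- the purple end of the palette
print(palette_position(64.0))  # 1.0 -- the opposite extreme
```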

We’re truly excited and looking forward to research, products and services this golden age may offer.

Goodbye HICO, Hello PACE – Ocean Colour’s Satellite Symmetry

HICO™ Data, image of Hong Kong from the Oregon State University HICO Sample Image Gallery, provided by the Naval Research Laboratory


Ocean colour is the acorn from which Pixalytics eventually grew, and so we were delighted to see last week’s NASA announcement that one of their next generation ocean colour satellites is now more secure, with a launch scheduled for 2022.

Unsurprisingly, the term ocean colour refers to the study of the colour of the ocean, although in reality it covers a suite of different products, the central one for the open oceans being the concentration of phytoplankton. Ocean colour is determined by how much of the sun’s energy the ocean scatters and absorbs, which in turn depends on the water itself alongside substances within the water, including phytoplankton and suspended sediments together with dissolved substances and chemicals. Phytoplankton can be used as a barometer of the health of the oceans: phytoplankton are found where nutrient levels are high, and oceans with low nutrients have little phytoplankton. Sam’s PhD involved the measurement of suspended sediment coming out of the Humber estuary back in 1995, and it has remained an active field of her research for the last 20 years.
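That balance of scattering and absorption is often written, to first order, as reflectance being proportional to backscattering divided by absorption plus backscattering (after the classic work of Gordon and co-workers). A toy illustration – the coefficient values below are made up for the example:

```python
def reflectance_proxy(a, bb, f=0.33):
    """First-order irradiance reflectance just below the surface,
    R ~ f * bb / (a + bb), where a is total absorption and bb total
    backscattering (both in 1/m); f ~ 0.33 is a typical factor."""
    return f * bb / (a + bb)

# More phytoplankton raises absorption in the blue, lowering blue
# reflectance relative to green -- the basis of band-ratio
# chlorophyll algorithms.
clear_blue = reflectance_proxy(a=0.02, bb=0.003)
greener_blue = reflectance_proxy(a=0.10, bb=0.004)
print(clear_blue > greener_blue)  # True
```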

Satellite ocean colour remote sensing began with the launch of NASA’s Coastal Zone Color Scanner (CZCS) on 24th October 1978. It had six spectral bands, four of which were devoted to ocean colour, and a spatial resolution of around 800 m. Despite an anticipated lifespan of only one year, it operated until 22nd June 1986 and has been used as a key dataset ever since. Sadly, CZCS’s demise marked the start of a decade-long gap in NASA’s ocean colour data archive.

Although there were some intermediate ocean colour missions, it was the launch of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) that brought the next significant archive of ocean colour data. SeaWiFS had 8 spectral bands optimised for ocean colour and operated at a 1 km spatial resolution. One of Sam’s first jobs was developing a SeaWiFS data processor, and the instrument collected data until the end of its mission in December 2010.

Currently, global ocean colour data primarily comes from either NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) on-board the twin Aqua and Terra satellites, or the Visible Infrared Imaging Radiometer Suite (VIIRS), which is on a joint NOAA/NASA satellite called Suomi NPP. MODIS has 36 spectral bands and spatial resolution ranging from 250 to 1000 m; whilst VIIRS has twenty-two spectral bands and a resolution of 375 to 750 m.

Until recently, there was also the ONR/NRL/NASA Hyperspectral Imager for the Coastal Ocean (HICO) mission on-board the International Space Station. It collected selected coastal region data with a spectral range of 380 to 960 nm and 90 m spatial resolution. It was designed to collect only one scene per orbit and acquired over 10,000 such scenes after its launch. Unfortunately, it suffered damage during a solar storm in September 2014, and its retirement was officially announced a few days ago with the confirmation that the damage could not be repaired.

In the same week we wave goodbye to HICO, NASA announced the 2022 launch of the Pre-Aerosol and ocean Ecosystem (PACE) mission, in a form of ocean colour symmetry. PACE is part of the next generation of ocean colour satellites; it’s intended to carry an ocean ecosystem spectrometer/radiometer, built by NASA’s Goddard Space Flight Centre, that will measure spectral wavebands from the ultraviolet to the near infrared. It will also have an aerosol/cloud polarimeter to help improve our understanding of the flow, and role, of aerosols in the environment.

PACE will be preceded by several other missions with an ocean colour focus, including the European Sentinel-3 mission within the next year; it will carry the Ocean and Land Colour Instrument, with 21 spectral bands and 300 m spatial resolution, building on Envisat’s Medium Resolution Imaging Spectrometer (MERIS). Sentinel-3 will also carry a Sea and Land Surface Temperature Radiometer, and should help to significantly improve the quality of ocean colour data by supporting better atmospheric correction.

Knowledge of the global phytoplankton biomass is critical to understanding the health of the oceans, which impacts the planet’s carbon cycle and, in turn, the evolution of our planet’s climate. A continuous ocean colour time series is critical to this, and so we are already looking forward to the data from Sentinel-3 and PACE.