Are you Celebrating Earth Day?

Animation of the biosphere created using SeaWiFS data; courtesy of NASA/OBPG

Did you know that today, the 22nd April, is the globally celebrated International Mother Earth Day? Also known simply as Earth Day, over one billion people participate in this event every year, and 2015 marks its 45th anniversary.

The first Earth Day took place in 1970 in America, when approximately 20 million citizens got involved in rallies to show support for environmental reform. This level of backing was one of the key factors that led to the creation of the US Environmental Protection Agency on the 2nd December later that year, together with the passing of a variety of environmental legislation. Earth Day continued to grow in popularity, with particularly big celebrations in 1990 and 2000; then in 2009 the United Nations passed a resolution designating the 22nd April as International Mother Earth Day. The resolution acknowledged that the Earth and its ecosystems are our home, and that in order to achieve a balance among the economic, social, and environmental needs of present and future generations, it was necessary to promote harmony with nature and the Earth.

Six years later, almost two hundred countries celebrate the event, which is co-ordinated by the Earth Day Network. This year’s theme, ‘It’s Our Turn To Lead’, has three key messages:

  • Sustainable Development: Ensuring the future economic development of the world is built on a sustainable, low carbon footing.
  • Making Everyone’s Voices Heard: Getting world leaders to pay attention to the voices across the world who want change.
  • Getting A Global Environmental Treaty: Making the 2015 United Nations Climate Change Conference, to be held at the start of December in Paris, the one that secures a binding, global climate treaty.

There are events both online and all over the world; you can check what is happening close to you on this website.

In the modern calendar there are a lot of ‘Days’ for a lot of things, and you may wonder whether they’re worth supporting; well, consider what the first Earth Day achieved in 1970. So what do you think? Is it worth standing up and having your voice heard for Earth Day in 2015?

Ocean Colour Partnership Blooms

Landsat 8 Natural Colour image of Algal Blooms in Lake Erie acquired on 01 August 2014. Image Courtesy of NASA/USGS.

Last week NASA, NOAA, USGS and the US Environmental Protection Agency announced a $3.6 million partnership to use satellite data as an early warning system for harmful freshwater algae blooms.

An algal bloom is a high concentration of microscopic algae, known as phytoplankton, in a body of water. Blooms can grow quickly in nutrient-rich waters and can have toxic effects. Shellfish filter large quantities of water and can concentrate the algae in their tissues, allowing toxins to enter the marine food chain and making the shellfish potentially unsafe for human consumption. Blooms can also contaminate drinking water: last August around 400,000 people in Toledo, Ohio, were advised not to drink their tap water after an algal bloom in Lake Erie.

The partnership will use the satellite remote sensing technique of ocean colour as the basis for the early warning system. Ocean colour isn’t a new technique; observations date back to at least the 1600s, when Henry Hudson noted in his ship’s log that a sea pestered with ice had a black-blue colour.

Phytoplankton within algal blooms are microscopic, some only a thousandth of a millimetre in size, so it’s not possible to see individual organisms from space. However, phytoplankton contain a photosynthetic pigment that is visible to the human eye, and in sufficient quantities this material can be measured from space. As the phytoplankton concentration increases, reflectance in the blue waveband decreases whilst reflectance in the green waveband increases slightly. Therefore, a ratio of blue to green reflectance can be used to derive quantitative estimates of the concentration of phytoplankton.
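The blue-to-green ratio idea can be sketched in a few lines of code. This is only an illustration of the general form of empirical ocean colour algorithms, where the logarithm of chlorophyll concentration is modelled as a function of the logarithm of the band ratio; the coefficients below are hypothetical placeholders, not any operational NASA algorithm.

```python
import math

def chlorophyll_from_ratio(blue_refl, green_refl, a0=0.3, a1=-2.5):
    """Estimate chlorophyll-a (mg/m^3) from a blue/green reflectance
    ratio. Operational algorithms fit higher-order polynomials to
    in-situ data; a0 and a1 here are illustrative placeholders."""
    x = math.log10(blue_refl / green_refl)  # ratio falls as blooms grow
    return 10.0 ** (a0 + a1 * x)

# Bloom waters reflect relatively less blue, so the ratio drops
# and the derived concentration rises.
clear = chlorophyll_from_ratio(0.010, 0.005)   # blue well above green
bloom = chlorophyll_from_ratio(0.004, 0.006)   # blue below green
```

With real satellite data the reflectances would first need atmospheric correction, but the ratio step itself is this simple.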

The US agency partnership is the first step in a five-year project to create a reliable and standard method for identifying blooms of a specific group of phytoplankton, cyanobacteria, in US freshwater lakes and reservoirs. To detect blooms it will be necessary to study local environments to understand the factors that influence the initiation and evolution of a bloom.

It won’t be easy to create this methodology as inland waters, unlike open oceans, have a variety of other organic and inorganic materials suspended in the water through land surface run-off, which will also have a reflectance signal. Hence, it will be necessary to ensure that other types of suspended particulate matter are excluded from the prediction methodology.

It’s an exciting development in our specialist area of ocean colour. We wish them luck and we’ll be looking forward to their research findings in the coming years.

Lidar: From space to your garage and pocket

Lidar data overlaid on an aerial photo for Pinellas Point, Tampa Bay, USA. Data courtesy of the NASA Experimental Airborne Advanced Research Lidar (EAARL), http://gulfsci.usgs.gov/tampabay/data/1_lidar/index.html

Lidar isn’t a word most people use regularly, but recent developments in the field might see a future where it becomes part of everyday life.

Lidar, an acronym for LIght Detection And Ranging, was first developed in the 1960s and is primarily a technique for measuring distance; other applications include atmospheric Lidar, which measures clouds, particles and gases such as ozone. A system comprises a laser, a scanner and a GPS receiver, and works by emitting a laser pulse towards a target and measuring the time the pulse takes to return.
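That time-of-flight principle is simple enough to write down directly; here is a minimal sketch (the constant and function names are ours, not from any Lidar library):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second, in vacuum

def lidar_range(round_trip_time_s):
    """Distance to the target from a pulse's round-trip time.
    The pulse travels out and back, hence the division by two."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A return arriving roughly 6.7 microseconds after emission
# corresponds to a target about 1 km away.
distance_m = lidar_range(6.67e-6)
```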

There are two main types of Lidar used within remote sensing for measuring distance, topographic and bathymetric; topographic Lidar uses a near infrared laser to map land, while bathymetric Lidar uses water-penetrating green light to measure the seafloor. The image at the top of the blog is bathymetric Lidar overlaying an aerial photograph of Pinellas Point, Tampa Bay in the USA, showing depths below sea level in metres. Airborne terrestrial Lidar applications have also expanded to include measuring forest structures and tree canopy mapping, whilst there are ground-based terrestrial laser scanners for mapping structures such as buildings.

As a user, freely accessible airborne Lidar data isn’t easy to come by, but there are some places that offer open datasets.

Spaceborne terrestrial Lidar has been limited, as it has to overcome a number of challenges:

  • It’s an active remote sensing technique, which means it requires much more power to run than passive systems, and for satellites more power means more cost.
  • It’s an optical system that like all optical systems is affected by cloud cover and poor visibility, although interestingly it works more effectively at night, as the processing doesn’t need to account for the sun’s reflection.
  • Lidar performance decreases with the inverse square of the distance between the system and the target.
  • Lidar collects individual points rather than an image; images are created by combining many individual points. Whilst multiple overflights can be made quickly from a plane, a satellite orbiting the Earth effectively collects lines of points over a number of days, which takes time.
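The inverse-square point above can be made concrete with a toy calculation. This is a deliberate simplification: the full Lidar equation also includes aperture, efficiency and atmospheric terms, all held constant here.

```python
def relative_return_power(range_m, ref_range_m=1_000.0):
    """Returned signal relative to that at a reference range, assuming
    received power falls off with the inverse square of distance and
    everything else stays constant."""
    return (ref_range_m / range_m) ** 2

# Doubling the range quarters the return, which is why a satellite
# hundreds of kilometres up needs far more laser power than an
# aircraft flying at a few kilometres.
```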

The only satellite to have studied the Earth’s surface using Lidar is NASA’s Ice, Cloud and land Elevation Satellite with its Geoscience Laser Altimeter System (ICESat-GLAS); launched in 2003, it was decommissioned in 2010. It measured ice sheet elevations and changes, together with cloud and aerosol height profiles, land elevation and vegetation cover, and sea ice thickness; you can find its data products here. Its successor, ICESat-2, is scheduled for launch in 2017. The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission, part of the A-Train satellite constellation, is a joint NASA and CNES mission launched in 2006. Originally designed as an atmosphere-focused Lidar, it has since developed marine applications that led to the SABOR campaign we discussed in a previous blog.

Beyond remote sensing, Lidar may become part of every household in the future, if recent proof-of-concepts come to fruition. The Google self-driving car uses Lidar as part of its navigation system to generate 3D maps of the surrounding environment. In addition, research recently published in Optics Express by Dr Ali Hajimiri of the California Institute of Technology describes a tiny Lidar device with the potential to turn mobile phones into 3D scanners. Using a nanophotonic coherent imager, the proof-of-concept device produced a 3D image of the front of a US penny from half a metre away, with 15-μm depth resolution and 50-μm lateral resolution.

Lidar has many remote sensing and surveying applications; in the future, however, we could all have lasers in our garages and pockets.

Collaborative Earth Observation

This image combines two Sentinel-1A radar scans from 3 and 15 January 2015 to show ice velocities on outlet glaciers of Greenland’s west coast. Courtesy of Copernicus data (2015)/ESA/Enveo

Establishing an Earth observation system is a large and expensive project, combining satellite development and launch with ground-based infrastructure, yet the Earth observation community itself is fairly small. Working collaboratively and in partnership can therefore help leverage initiatives, funding, research and publicity to demonstrate the value, and benefits, of our industry to the wider world.

Last week saw the announcement of three international collaborations for the UK, two at a national level and one at a local Pixalytics level! Firstly, the UK Space Agency announced seven new collaborative projects between UK companies and international partners, funded through the International Partnerships Space Programme, to develop satellite technology and applications in emerging economies.

The projects included e-learning solutions for schools in Tanzania, developing satellite air navigation, low-cost telecommunications CubeSats, enhancing digital connectivity in Kenya and developing instruments for the next generation of meteorological and disaster management satellites. There were also two Earth observation specific projects:

  • Enabling Kazakhstan’s Earth observation capability by developing and testing ground receiving stations ahead of the planned 2016 launch of the KazSTSAT small satellite mission, which will produce over 70 gigabytes of data daily.
  • Oceania Pacific Recovery and Protection in Disaster (RAPID) system which will aim to improve the use of satellite data in the aftermath of natural disasters, by getting critical decision influencing information to people in the field as quickly as possible.

The second collaboration was the UK signing the Ground Segment Cooperation agreement with ESA for the EU’s Copernicus programme. This sees the establishment of a data hub in Harwell to provide UK users with easier access to the free and publicly available data from the Copernicus Sentinel missions, and a wide range of complementary missions. The Sentinel missions will form the backbone of this data, with 14 planned satellite launches by 2025; eventually providing around 8 terabytes of data daily. Launched in 2014, Sentinel-1A is the first mission and carries a C-band Synthetic Aperture Radar (SAR) instrument providing all-weather, day-and-night imagery of the Earth’s surface; it is producing some stunning images including the one at the top of this blog. Next up will be Sentinel-2A this summer which will offer optical data across 13 spectral bands, with 4 bands at 10 m spatial resolution, 6 bands at 20 m and 3 bands at 60 m.

The final collaborative partnership is closer to home; as Pixalytics is delighted to announce that we have an international PhD student, through the European Union’s Erasmus Programme, coming to work with us over the summer.

Remote sensing and Earth observation are becoming increasingly collaborative, and this is only likely to continue in the future. Everyone should encourage and support these developments, as working together will achieve much more than working alone.

Goodbye HICO, Hello PACE – Ocean Colour’s Satellite Symmetry

HICO™ Data, image of Hong Kong from the Oregon State University HICO Sample Image Gallery, provided by the Naval Research Laboratory

Ocean colour is the acorn from which Pixalytics eventually grew, and so we were delighted to see last week’s NASA announcement that one of their next generation ocean colour satellites is now more secure, with a launch scheduled for 2022.

Unsurprisingly, the term ocean colour refers to the study of the colour of the ocean, although in reality it covers a suite of different products, the central one for the open oceans being the concentration of phytoplankton. Ocean colour is determined by how much of the sun’s energy the ocean scatters and absorbs, which in turn depends on the water itself alongside substances within it, including phytoplankton and suspended sediments, together with dissolved substances and chemicals. Phytoplankton can be used as a barometer of the health of the oceans: phytoplankton are abundant where nutrient levels are high, while low-nutrient oceans have little phytoplankton. Sam’s PhD involved measuring the suspended sediment coming out of the Humber estuary back in 1995, and it has remained an active field of her research for the last 20 years.

Satellite ocean colour remote sensing began with the launch of NASA’s Coastal Zone Colour Scanner (CZCS) on the 24th October 1978. It had six spectral bands, four of which were devoted to ocean colour, and a spatial resolution of around 800 m. Despite an anticipated lifespan of only one year, it operated until the 22nd June 1986 and has provided a key dataset ever since. Sadly, CZCS’s demise marked the start of a decade-long gap in NASA’s ocean colour data archive.

Although there were some intermediate ocean colour missions, it was the launch of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) that brought the next significant archive of ocean colour data. SeaWiFS had eight spectral bands optimised for ocean colour and operated at a 1 km spatial resolution. One of Sam’s first jobs was developing a SeaWiFS data processor, and the satellite collected data until the end of its mission in December 2010.

Currently, global ocean colour data primarily comes from either NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS), on board the twin Aqua and Terra satellites, or the Visible Infrared Imaging Radiometer Suite (VIIRS) on Suomi NPP, a joint NOAA/NASA satellite. MODIS has 36 spectral bands and spatial resolutions ranging from 250 to 1000 m, whilst VIIRS has 22 spectral bands and resolutions of 375 to 750 m.

Until recently, there was also the ONR/NRL/NASA Hyperspectral Imager for the Coastal Ocean (HICO) mission on board the International Space Station. It collected data for selected coastal regions across a spectral range of 380 to 960 nm, at a 90 m spatial resolution. It was designed to collect only one scene per orbit, and acquired over 10,000 such scenes after its launch. Unfortunately, it suffered damage during a solar storm in September 2014, and its retirement was officially announced a few days ago with the confirmation that it wasn’t possible to repair the damage.

In a form of ocean colour symmetry, in the same week we waved goodbye to HICO, NASA announced the 2022 launch of the Pre-Aerosol, Clouds and ocean Ecosystem (PACE) mission. PACE is part of the next generation of ocean colour satellites: it’s intended to carry an ocean ecosystem spectrometer/radiometer, built by NASA’s Goddard Space Flight Centre, measuring spectral wavebands from the ultraviolet to the near infrared. It will also have an aerosol/cloud polarimeter to help improve our understanding of the flow, and role, of aerosols in the environment.

PACE will be preceded by several other missions with an ocean colour focus, including the European Sentinel-3 mission within the next year. Sentinel-3 will carry an Ocean and Land Colour Instrument, with 21 spectral bands and 300 m spatial resolution, building on Envisat’s Medium Resolution Imaging Spectrometer (MERIS). It will also carry a Sea and Land Surface Temperature Radiometer, and should help to significantly improve the quality of ocean colour data by supporting better atmospheric correction.

Knowledge of the global phytoplankton biomass is critical to understanding the health of the oceans, which impacts the planet’s carbon cycle and in turn affects the evolution of our climate. A continuous ocean colour time series is critical to this, and so we are already looking forward to the data from Sentinel-3 and PACE.

British Science Won’t Be Eclipsed

Hawthorn leaves opening in Plymouth on 18th March 2015

We’re celebrating science in this blog, as it’s British Science Week in the UK! Despite its name, British Science Week is actually a ten-day programme celebrating science, technology, engineering and maths (STEM). The week is co-ordinated by the British Science Association, a charity founded in 1831.

The British Science Association, like us at Pixalytics, firmly believes that science should be at the heart of society and culture, and aims to inform, educate and inspire people to get interested and involved in science. It promotes these aims by supporting a variety of conferences, festivals, awards and training, and by encouraging young people to take up STEM subjects.

British Science Week is one of its major annual festivals, with hundreds of events running up and down the country. The website has a search facility, so you can see what events are running locally. Down here in Plymouth, the events include Ocean Science at the National Marine Aquarium; tomorrow at the Museum & Art Gallery you can learn about the science behind the headlines; and on Saturday, also at the Museum, there’s an animal-themed day including some real mini-beasts from Dartmoor Zoo – the place that inspired the 2011 film ‘We Bought A Zoo’, starring Matt Damon and Scarlett Johansson.

If you can’t get to any of the events in your local area, British Science Week is also promoting two citizen science projects:

  • Nature’s Calendar, run by the Woodland Trust, asks everyone to look out for up to six common natural events to see how fast spring is arriving this year. They want your first sightings of the orange-tip butterfly, the 7-spot ladybird, frog spawn, oak leaves, hawthorn leaves and hawthorn flowers. This will continue a dataset that began in 1736 – and we thought the Landsat archive was doing well!
  • Worm Watch Lab – A project to help scientists better understand how the brain works by observing the egg-laying behaviour of nematode worms. You watch a 30-second video and click a key if you see a worm lay an egg. We’ve watched a few and are yet to see the egg-laying moment, but all this video watching is building a valuable dataset for the scientists.

If you are interested in citizen science and go to sea, why not get involved in the citizen science work we support by taking part in the Secchi Disk Project. Phytoplankton underpin the marine food chain and are particularly sensitive to changes in sea-surface temperature, so this project aims to better understand their current global abundance. You lower a Secchi disk, a plain white disk attached to a tape measure, over the side of a boat and record the depth below the surface at which it disappears from sight. This measurement is uploaded to the website and helps build a global dataset of seawater clarity, which in turn indicates the amount of phytoplankton at the sea surface. All the details on how to get involved are on the website.
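For the curious, a classic empirical relation links the disappearance depth to the water's light attenuation. The constant 1.7 below is the often-quoted Poole and Atkins value; it varies between water types, so treat this as a rough first-order sketch rather than a calibrated conversion.

```python
def diffuse_attenuation(secchi_depth_m, constant=1.7):
    """Approximate diffuse attenuation coefficient (per metre) from a
    Secchi disk depth via k ~ constant / Z_SD. Murkier water hides
    the disk sooner and so gives a larger k."""
    if secchi_depth_m <= 0:
        raise ValueError("Secchi depth must be positive")
    return constant / secchi_depth_m

# A disk vanishing at 2 m implies far more attenuating material
# (e.g. phytoplankton) than one still visible at 10 m.
murky = diffuse_attenuation(2.0)
clear = diffuse_attenuation(10.0)
```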

On Friday, nature is getting involved by providing a partial solar eclipse over the UK. Starting at around 8.30am, the moon will take about an hour to reach maximum effect, when the partial eclipse will be visible across most of the country – although the amount of cloud will determine exactly what you see. Plymouth will be amongst the first places in the country to see the maximum effect, at around 9.23am–9.25am; however, the country’s best views will be on the Isle of Lewis in Scotland, with a 98% eclipse predicted. The only two landmasses that will see a total eclipse are the Faroe Islands and the Norwegian arctic archipelago of Svalbard. The last total eclipse in the UK was on the 11th August 1999, and the next one isn’t due until 23 September 2090!

Although the eclipse is a spectacular natural event, remember not to look directly at the sun, as this can damage your eyes. To view the eclipse wear a pair of special eclipse glasses, use a pinhole camera or watch it on the television!

We fully support British Science Week; it’s a great idea and we hope it will inspire more people to get involved in science.

Did you know remote sensing goes extra-terrestrial?

Ceres captured by NASA’s Dawn spacecraft on 19 Feb 2015. Image courtesy NASA/JPL-Caltech/UCLA/MPS/DLR/IDA

If you didn’t realise remote sensing of other planets and space objects occurs, you’re not alone. Remote sensing is playing an important role in helping us understand how our planet, and our universe, was created; however this isn’t celebrated much outside, or even within, the remote sensing community. We discussed this topic when ESA’s Rosetta arrived at Comet 67P, and it surfaced again last week when NASA’s Dawn spacecraft went into orbit around the dwarf planet Ceres, which lies between Mars and Jupiter; Dawn was around 38,000 miles from Ceres when it was captured by the dwarf planet’s gravity.

Dawn’s mission is to study Ceres and the asteroid Vesta, which it orbited during 2011 and 2012, to develop our understanding of early solar system formation. There was a lot of media attention around Dawn’s arrival at Ceres, as it’s the first spacecraft to visit a dwarf planet and also the first to orbit two different extraterrestrial bodies. The technical and engineering feat of getting Dawn to Vesta and Ceres is amazing, but the science of acquiring, and interpreting, the data is pure remote sensing. However, you rarely see it described as such within the headlines.

Dawn carries three scientific instruments:

  1. A camera, designed by the Max Planck Institute for Solar System Research in Germany, which provides both colour and panchromatic images; when Dawn descends into a low orbit around Ceres it will offer 62 m spatial resolution. It has seven colour filters, can detect near-infrared energy and has 8 gigabytes of internal memory. As the camera is vital to both the navigation and the science sides of the mission, Dawn carries two identical, but physically separate, versions.
  2. A Visible and Infrared Mapping Spectrometer (VIR-MS), designed and built by Galileo Avionica in Italy to provide surface maps. The instrument covers 0.25–1 µm in the visible range and 0.95–5 µm in the infrared, and has 6 gigabits of internal memory. Interestingly, it was based on the VIRTIS instrument carried by Rosetta to map Comet 67P.
  3. A Gamma Ray and Neutron Detector (GRaND), with 21 sensors and a wide field of view, which produces maps of Ceres measuring rock-forming elements, trace elements and radioactive elements, as well as hydrogen, carbon and nitrogen. It was developed by the Los Alamos National Laboratory in the United States and, unlike the other two instruments, has no internal storage.

Supporting these instrument measurements will be various radiometric and navigational data to help determine the gravitational field. The fundamental principle of remote sensing – measuring the reflected energy of a body to determine what is on its surface – is right at the heart of Dawn’s mission. So why isn’t the remote sensing community shouting more about it?

We’re probably as guilty as everyone else here; we refer to Pixalytics as either a remote sensing company and/or an Earth observation company. Is it this association to Earth, which means we don’t always acknowledge the work, and achievements, beyond our planet?

Remote sensing is leading the way in enhancing our knowledge of how the universe began; it is our scientific field that is helping to make this possible. So let’s make some noise for the remote sensing community, elbow the space engineers out of the way to get ourselves into the news, and let everybody else know what remote sensing can do!

Landsat Showing What The Eye Can’t See

Landsat 8 True colour composite of Paris from 11 November 2014. Courtesy NASA/USGS.

In our recent blog we described the five simple steps to select, download and view LandsatLook Natural Colour Images. However, did you know that the Natural Colour Image isn’t actually a single image? Instead, it’s a combination of three separate images!

This is because remote sensing works by exploiting the fact that the Earth’s surfaces, and the substances on them, reflect electromagnetic energy in different ways. Using different parts of the electromagnetic spectrum makes it possible to see details, features and information that aren’t obvious to the naked eye. Some remote sensing satellites carry instruments that can measure more than one part of the electromagnetic spectrum, with each different measurement known as a spectral band.

Landsat 8 currently has two instruments, measuring eleven different spectral bands:

  • Three visible light bands that approximate red, green and blue
  • One near infrared band
  • Two shortwave infrared bands
  • Two thermal bands used for sensing temperature
  • One panchromatic band with a higher spatial resolution
  • Two bands focused on coastal aerosols and cirrus clouds

Combining the red, green and blue bands produces a single image very similar to what your eye would see; this composite is the Natural Colour Image product that Landsat offers. However, you can also create your own colour composites using image processing software, as Landsat offers each individual spectral band for download as a Level 1 GeoTIFF file.

Once imported into an image processing package, it’s straightforward to create different composites by combining different spectral bands. For example, combining the red, green and blue bands creates an image like the one at the top of the blog, showing the eastern edge of Paris with the Bois de Vincennes, the largest public park in the city, on the left-hand side.

This image has the colours your eyes expect to see – trees are green, water is blue, and so on – and is known as a true colour or RGB composite. Combining other spectral bands produces images where the colours differ from what you would expect; these are known as false colour composites. Because they use parts of the electromagnetic spectrum in which the Earth’s surfaces reflect differently, features hidden in true colour can become far more prominent.

Landsat 8 False colour composite of Paris from 11 November 2014. Courtesy NASA/USGS.

An example of a false colour composite can be seen on the right; it uses the near infrared, red and green bands. As in the RGB image, the park is easily distinguishable from the surrounding city, but in the false colour image the park’s water features, the Lac Daumesnil and the Lac des Minimes, become visible as black swirls.

Landsat 8 False colour composite of Paris from 11 November 2014. Courtesy NASA/USGS.

A second example of a false colour composite is shown on the right, this time combining the near infrared, shortwave infrared 2 and coastal aerosol bands. In this case, the vegetation of Paris appears orange and jumps out of the image, in contrast to the urban areas shown in blue.
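In code, building these composites amounts to stretching three bands and stacking them. The sketch below uses synthetic arrays in place of real Level 1 GeoTIFF bands; reading actual files would need a geospatial library such as GDAL or rasterio, and the function names here are ours.

```python
import numpy as np

def stretch(band, low_pct=2, high_pct=98):
    """Percentile contrast stretch: map the 2nd-98th percentile range
    of a band onto 0-1 so that faint detail becomes visible."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

def composite(red_channel, green_channel, blue_channel):
    """Stack any three spectral bands into an RGB composite. Passing
    red/green/blue gives a true colour image; passing, say, near
    infrared/red/green gives a false colour one."""
    return np.dstack([stretch(red_channel),
                      stretch(green_channel),
                      stretch(blue_channel)])

# Synthetic 4x4 'bands' stand in for downloaded GeoTIFF data.
rng = np.random.default_rng(42)
nir, red, green = (rng.random((4, 4)) for _ in range(3))
false_colour = composite(nir, red, green)  # shape (4, 4, 3)
```

Swapping which bands go into which channel is all that separates a true colour composite from a false colour one.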

Using different combinations of spectral bands is just one remote sensing technique for creating valuable information and knowledge from an image. However, every satellite measures different spectral bands, and you need to be aware of what you are looking at. For example, we’ve described Landsat 8 in this blog, but previous Landsat missions measured similar, though slightly different, spectral bands; full details of all Landsat missions and their spectral bands can be found here.

Using the individual spectral bands, rather than relying on the set Landsat products, means you may gain new insights into the area you are looking at, and you can create some fantastic images. You can literally make things appear before your eyes!

Five Landsat Quirks You Should Know

South West England from the 8th December 2014. Landsat 7 imagery courtesy of NASA Goddard Space Flight Center and U.S. Geological Survey

If you’ve started using Landsat after our five simple steps blog last week, or perhaps you’ve used its imagery for a while, you may have come across what we’ll call the quirks of Landsat: things you didn’t understand, things that confused you, or places where you thought you’d done something wrong. This week we’re going to try to demystify some of the common quirks and questions around Landsat data and imagery.

Quirk One: What do the WRS path and row numbers mean?
The Worldwide Reference System (WRS) is what Landsat uses to map its orbits around the world, and it is defined by sequential path and row numbers. Despite its name, there are in fact two versions of the WRS: WRS-1, used for Landsats 1–3, and WRS-2 for the rest of the missions.

The paths are a series of near-vertical tracks running from east to west, with Path 001 crossing the equator at 65.48 degrees west longitude. WRS-1 has 251 paths, whereas the instruments on Landsat 4 and beyond have a wider swath and only require 233 paths to cover the globe. Both WRS-1 and WRS-2 use the same 119 rows: Row 001 starts near the North Pole at latitude 80 degrees, 1 minute and 12 seconds north; Row 060 coincides with the equator; and Row 119 mirrors the start at 80 degrees, 1 minute and 12 seconds south. A combination of path and row gives a unique reference within Landsat, with the path number always first, followed by the row number. For example, 204-025 is the WRS-2 path and row for Plymouth.
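A small helper makes the path/row convention concrete; the function name and range checks below are ours, based on the WRS-2 limits just described.

```python
def wrs2_reference(path, row):
    """Format a WRS-2 path/row pair as the zero-padded 'PPP-RRR'
    reference, checking the values against the valid ranges:
    paths 1-233 and rows 1-119."""
    if not (1 <= path <= 233):
        raise ValueError("WRS-2 paths run from 1 to 233")
    if not (1 <= row <= 119):
        raise ValueError("WRS-2 rows run from 1 to 119")
    return f"{path:03d}-{row:03d}"

# Plymouth's scene, as mentioned above:
plymouth = wrs2_reference(204, 25)   # "204-025"
```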

There are maps of the paths and rows available. However, there is also a handy USGS website that converts path and row numbers to latitude and longitude and vice versa; it’s accompanied by a map so you can check you’ve got the area you want!

Quirk Two: My image has a minus one percent cloud cover
This one can be confusing! On the GloVis image selector you have the option to specify the maximum percentage of cloud cover on your image. Selecting 50% means up to 50% of the image could be cloud, and selecting 0% means no cloud at all.

Cloud cover is calculated using both the optical and thermal bands; as any Landsat imagery taken with the Multispectral Scanner System (MSS) does not include a thermal band, the cloud cover percentage cannot easily be calculated. Where no calculation occurs, the cloud cover percentage is set to -1%.

At the bottom of the Scene Information Box there is a line for Sensor/Product. Although the title varies, it effectively displays the same information. If the sensor/product line includes TM, ETM+ or OLI-TIRS (Thematic Mapper, Enhanced Thematic Mapper Plus, or Operational Land Imager-Thermal InfraRed Sensor, respectively), the cloud cover will usually be calculated, as all these sensors have a thermal band. If the sensor/product is MSS, the cloud cover percentage will be -1%.

Landsat 8 uses the OLI-TIRS sensor, Landsat 7 has the ETM+ sensor, Landsats 4 & 5 have both TM and MSS sensors, and Landsats 1, 2 & 3 only have MSS.
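
The rule of thumb above is simple enough to capture in a few lines. Here’s a small Python sketch (our own illustration, nothing to do with GloVis internals) that records which sensors carry a thermal band and reports -1% for those that don’t:

```python
# Whether a real cloud-cover percentage can be reported depends on the
# sensor having a thermal band; MSS scenes are flagged as -1%.
SENSOR_HAS_THERMAL = {
    "MSS": False,       # Multispectral Scanner System - no thermal band
    "TM": True,         # Thematic Mapper (Landsats 4 & 5)
    "ETM+": True,       # Enhanced Thematic Mapper Plus (Landsat 7)
    "OLI-TIRS": True,   # Operational Land Imager / Thermal InfraRed Sensor (Landsat 8)
}

def reported_cloud_cover(sensor, measured_percentage):
    """Return the cloud-cover value a scene would report: the measured
    percentage when a thermal band is available, otherwise -1."""
    return measured_percentage if SENSOR_HAS_THERMAL.get(sensor, False) else -1
```

So an MSS scene always reports -1%, however cloudy it really is.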

Quirk Three: What are all the other files alongside the LandsatLook Natural Colour Image?
When you select an image from Landsat, you’re given all available Landsat products associated with it. The most common additional products you’ll be offered are:

  • LandsatLook Thermal Image – Usually a jpeg of the thermal band, showing variations in temperature: darker areas are colder and lighter areas are warmer.
  • LandsatLook Quality Image – Currently only available for Landsat 8, this is a jpeg showing the positions of clouds and other features, such as snow and ice, on your image.
  • LandsatLook Images with Geographic Reference – A series of compressed data files that can be loaded into a Geographical Information System, allowing image processing techniques to be applied. These are big files compressed, and even bigger uncompressed, so you’ll need plenty of storage space if you start downloading them!

Quirk Four: Why do some Landsat 7 images have black stripes on them?

South West England from the 8th December 2014, showing black stripes. Landsat 7 imagery courtesy of USGS/NASA.

This is due to the failure of Landsat 7’s Scan Line Corrector on the 31st May 2003. The Scan Line Corrector compensates for the forward movement of the satellite as it orbits; without it, the sensor traces a zigzag ground track instead of mapping in straight lines. This leaves parts of the edge of the image unmapped, giving the black stripe effect, which can be seen clearly to the right in a zoomed-in version of the image at the top of the blog. The location of the black stripes varies, and each stripe represents between 390 and 450 m of the image; the US Geological Survey (USGS) estimates that affected images lose about 22% of their data.

The centre of the image can still be used; however, it’s more complicated to use Landsat 7 data acquired after May 2003. It’s worth noting that the sensor/product line in the Scene Information Box uses the notation SLC-off to indicate that an image was taken after the Scan Line Corrector failed.

Quirk Five: My image has brightly coloured single pixels

Landsat 5 MSS image acquired on 16 January 1997 via ESA receiving station. Image courtesy of USGS/NASA/ESA.

Brightly coloured single pixels that don’t match the surrounding area are a phenomenon known as Impulse Noise, which can also appear as dark or missing pixels. An example of an affected image is shown on the right. The most frequent causes are technical issues during the downlink from the satellite or during the transcription from tape to digital media. However, small fires on the ground can also show up as bright pixels and cause the same effect, although this is less frequent. As Landsat has a 30 m spatial resolution, these aren’t campfires or barbecues, but high-temperature features such as brush burning, wildfires or gas flares.
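
If you want to suppress isolated rogue pixels yourself, a median filter is the standard remedy for impulse noise, because the median of a neighbourhood ignores a single extreme value. Below is a minimal pure-Python 3×3 median filter written for this post; in practice you’d normally reach for an image-processing library instead:

```python
from statistics import median

def despeckle(image):
    """Replace each interior pixel with the median of its 3x3 neighbourhood,
    a standard way to suppress isolated (impulse-noise) pixels.
    `image` is a list of equal-length rows of numbers; border pixels,
    which lack a full neighbourhood, are left unchanged."""
    height, width = len(image), len(image[0])
    out = [row[:] for row in image]  # copy so the input is untouched
    for y in range(1, height - 1):
        for x in range(1, width - 1):
            window = [image[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = median(window)
    return out
```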

Images heavily affected by Impulse Noise aren’t released into the USGS archive. It’s also only visible when zoomed in, and selecting another image from a different date will most likely avoid the problem.

We hope this quintet of quirks has explained some of the queries and questions you might have about using Landsat data; if you’ve not come across any of these yet, this should give you a heads up for when you do.

Mastering Landsat Images in 5 Simple Steps!

Landsat 8 image of South West England from the 25th July 2014. Landsat imagery courtesy of NASA Goddard Space Flight Center and U.S. Geological Survey

Always wanted to use satellite imagery but weren’t sure where to start? This blog shows you five simple steps to find, download and view free imagery from the United States Geological Survey (USGS) Landsat satellites. Within fifteen minutes of reading this post you could have images from Landsat’s 40-year global archive on your computer, like the one at the top of this blog of Plymouth Hoe from the 25th July 2014. So what are we waiting for? Let’s get started …

Step One: Register!
Register for a user account with the USGS, who, along with NASA, manage the Landsat data archive. It’s free to create an account, although you will need an email address and to answer a few quick questions that help USGS assess their users. Once the account is activated, you’re ready to go and can download as much data as you need.

Step Two: Selecting your data download tool
USGS offers three tools for downloading data: the LandsatLook Viewer, the Global Visualisation Viewer (GloVis) and EarthExplorer. Whilst all three offer options to view Landsat data, we’d suggest you use GloVis as it’s the easiest tool for new users to navigate. GloVis has a main screen and a left sidebar; the sidebar controls which Landsat image is displayed in the main screen.

Step Three: Selecting the image
At the top of the sidebar is a map centred on the US, with a red dot indicating the position of the displayed image. To choose another location, use the map’s scroll bars to wander the world and simply click on the area you want to see. The four arrow buttons on the sidebar allow you to fine-tune the precise location.

Finally, select the month and year you’re interested in, and the Landsat image that most closely matches your selection will appear in the main window. As Landsat is an optical sensor, it cannot see through clouds; if the chosen image has clouds obscuring the view, use the Previous Scene and Next Scene buttons to step easily through the images closest to your preferred date.

It is worth noting the Max Cloud dropdown option, which allows you to choose the maximum percentage of the image you are willing to have covered by cloud. For example, if you select 40%, GloVis will only show you images with 40% or less cloud cover.
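
Behind the scenes, that dropdown amounts to a simple threshold filter on each scene’s cloud-cover value. Here’s a minimal Python sketch (the scene list and field names are invented for illustration); note that this version also drops scenes whose cloud cover is reported as -1%, i.e. unknown:

```python
def filter_by_max_cloud(scenes, max_cloud):
    """Keep scenes whose reported cloud cover is known (0 or more) and no
    greater than max_cloud; -1 marks scenes with no cloud estimate."""
    return [s for s in scenes if 0 <= s["cloud_cover"] <= max_cloud]

scenes = [
    {"id": "A", "cloud_cover": 12},
    {"id": "B", "cloud_cover": 55},
    {"id": "C", "cloud_cover": -1},  # e.g. an MSS scene with no estimate
]
print([s["id"] for s in filter_by_max_cloud(scenes, 40)])  # prints ['A']
```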

Step Four: Downloading the Landsat image
Once you have an image you like, simply click on Add at the bottom of the sidebar, and then click Send to Cart. This will take you to the download screen.

Your image will have an entity ID, which was also visible in the Scene Information Box on the previous screen. It consists of 21 characters, such as LC82040252014206LGN00, where:

  • The first three characters describe the Landsat satellite the image is from and LC8 refers to Landsat 8.
  • The next six (204025) are a Landsat catalogue reference known as the Worldwide Reference System path and row. If you remember the numbers for your area of interest, entering them in GloVis can be a quick way of navigating to that location.
  • The following seven characters give the year (2014) and the day of year (206) the image was taken; the day of the year is a numerical count starting with 001 on 1st January, and so 206 is 25th July.
  • A three-digit ground station identifier is next, in this case LGN indicates that the USGS Landsat Ground Network received this data.
  • Finally, the last two digits are a version number (00).
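
Because the fields sit at fixed positions, the breakdown above maps directly onto string slicing. This small Python helper (our own sketch, not a USGS tool) pulls out the pieces and converts the year and day-of-year into a calendar date:

```python
from datetime import date, timedelta

def parse_entity_id(entity_id):
    """Split a 21-character Landsat entity ID (e.g. LC82040252014206LGN00)
    into its component fields."""
    # Day-of-year counting starts at 001 on 1st January, hence the "- 1".
    acquired = (date(int(entity_id[9:13]), 1, 1)
                + timedelta(days=int(entity_id[13:16]) - 1))
    return {
        "sensor_satellite": entity_id[0:3],  # e.g. LC8 = Landsat 8
        "path": int(entity_id[3:6]),         # WRS path (204)
        "row": int(entity_id[6:9]),          # WRS row (025)
        "acquired": acquired,                # year + day-of-year as a date
        "ground_station": entity_id[16:19],  # LGN = USGS Landsat Ground Network
        "version": entity_id[19:21],
    }

scene = parse_entity_id("LC82040252014206LGN00")
print(scene["acquired"])  # 2014-07-25, i.e. day 206 of 2014
```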

Clicking the download button gives you options to download any of the Landsat products available for the image you’ve selected. The LandsatLook Natural Colour Image is a jpeg version of the image you were viewing in GloVis, and is the easiest one to use. Click on download and the image you’ve chosen will be saved to your computer.

Step Five: Viewing, and using, the Landsat image

Plymouth Sound on 25th July 2014 from Landsat 8: Image courtesy of USGS/NASA Landsat

The easiest way to view the image is with the Windows Photo Viewer tool, which lets you see the image and zoom in and out. You can also open the image in Windows Paint and use its basic tools to resize and crop it. For example, the image on the right is a zoomed-in version of the image at the top of this post.

Landsat images are free and carry no copyright; however, NASA does request that you attribute them appropriately – “Landsat imagery courtesy of NASA Goddard Space Flight Center and U.S. Geological Survey” or “USGS/NASA Landsat” – which means you can use Landsat images on your website or in other materials. Full information on Landsat copyright can be found here.

Next week, we’ll talk more about the other products you can download from Landsat. We hope these five simple steps have inspired you to find, download and use some Landsat data.