Inspiring the Next Generation of EO Scientists

Artist's rendition of a satellite - 3dsculptor/123RF Stock Photo


Last week, whilst Europe’s Earth Observation (EO) community was focussed on the successful launch of Sentinel-5P, over in America, Tuesday 10th October was Earth Observation Day!

This annual event is co-ordinated by AmericaView, a non-profit organisation whose aim is to advance the widespread use of remote sensing data and technology through education and outreach, workforce development, applied research, and technology transfer to the public and private sectors.

Earth Observation Day is a Science, Technology, Engineering, and Mathematics (STEM) event celebrating the Landsat mission and its forty-five year archive of imagery. Using satellite imagery provides valuable experience for children in maths and sciences, together with introducing subjects such as land cover, food production, hydrology, habitats, local climate and spatial thinking. The AmericaView website contains a wealth of EO materials available for teachers to use, from fun puzzles and games through to a variety of remote sensing tutorials. Even more impressive is that the event links schools to local scientists in remote sensing and geospatial technologies. These scientists provide support to teachers including giving talks, helping design lessons or being available to answer students’ questions.

This is a fantastic event by AmericaView, supported by wonderful resources and remote sensing specialists. We first wrote about this three years ago, and thought the UK would benefit from something similar. We still do. The UK Space Agency recently had an opportunity for organisations interested in providing education and outreach activities to support EO, the satellite launch programme or the James Webb Space Telescope. It will be interesting to see what the successful candidates come up with.

At Pixalytics we’re passionate about educating and inspiring the next generation of EO scientists. For example, we regularly support the Remote Sensing and Photogrammetry Society’s Wavelength conference for students and early career scientists; and sponsored the Best Early-Career Researcher prize at this year’s GISRUK Conference. We’re also involved with two exciting events at Plymouth’s Marine Biological Association, a Young Marine Biologists (YMB) Summit for 12-18 year olds at the end of this month and their 2018 Postgraduate conference.

Why is this important?
The space industry, and the EO sector within it, continues to grow. According to Euroconsult’s ‘Satellites to Be Built & Launched by 2026’ – I know this is another of the expensive reports we highlighted recently – there will be around 3,000 satellites with a mass above 50 kg launched in the next decade, of which around half are anticipated to be used for EO or communication purposes. This is almost double the number of satellites launched in the last ten years, and doesn’t include the increasing number of nanosatellites and cubesats going up.

Alongside the number of satellites, technological developments mean that the amount of EO data available is increasing almost exponentially. For example, earlier this month World View successfully completed a multi-day flight of its Stratollite™ service, which uses high-altitude balloons that can steer within stratospheric winds. These balloons can carry a variety of sensors – a mega-pixel camera flew on the recent flight – offering an alternative vehicle for collecting EO data.

Therefore, we need a future EO workforce who are excited, and inspired, by the possibilities and who will take this data and do fantastic things with it.

To find that workforce we need to shout about our exciting industry and make sure everyone knows about the career opportunities available.

Can You See The Great Wall of China From Space?

Area north of Beijing, China, showing the Great Wall of China running through the centre. Image acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

Dating back over two thousand three hundred years, the Great Wall of China winds its way from east to west across the northern part of the country. The current remains were built during the Ming Dynasty and have a length of 8,851.8 km according to 2009 work by the Chinese State Administration of Cultural Heritage and the National Bureau of Surveying and Mapping. However, if you take into account the different parts of the wall built by other dynasties, its length is almost twenty-two thousand kilometres.

The average height of the wall is between six and seven metres, and its width is between four and five metres. This width would allow five horses, or ten men, to walk side by side. The sheer size of the structure has led people to believe that it could be seen from space. This was first described by William Stukeley in 1754, when he wrote in reference to Hadrian’s Wall that ‘This mighty wall of four score miles in length is only exceeded by the Chinese Wall, which makes a considerable figure upon the terrestrial globe, and may be discerned at the Moon.’

Despite Stukeley’s personal opinion not having any scientific basis, it has been repeated many times since. By the time humans began to go into space, it was considered a fact. Unfortunately, astronauts such as Buzz Aldrin, Chris Hadfield and even China’s first astronaut, Yang Liwei, have all confirmed that the Great Wall is not visible to the naked eye from space. Even Pixalytics has got a little involved in this debate. Two years ago we wrote a blog saying that we couldn’t see the wall in Landsat imagery, as the spatial resolution was not fine enough to distinguish it from its surroundings.
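A quick back-of-the-envelope calculation supports the astronauts. Assuming an orbital altitude of around 400 km (roughly that of the International Space Station) and a naked-eye resolution of about one arcminute – both figures are our illustrative assumptions, not from the original debate – the wall’s width subtends an angle far too small to resolve:

```python
import math

def angular_size_arcsec(size_m, distance_m):
    """Angular size of an object in arcseconds (small-angle approximation)."""
    return math.degrees(size_m / distance_m) * 3600

wall_width = 5.0      # metres (roughly the wall's average width)
orbit_altitude = 400e3  # metres (approximate ISS altitude)
eye_limit = 60.0      # arcseconds (~1 arcminute, typical naked-eye resolution)

print(round(angular_size_arcsec(wall_width, orbit_altitude), 1))  # → 2.6
```

At about 2.6 arcseconds, the wall’s width is more than twenty times below the eye’s resolving limit, which is why only its influence on the surrounding landscape can give it away.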

Anyone who is familiar with the QI television series on the BBC will know that they occasionally ask the same question in different shows and give different answers when new information comes to light. This time it’s our turn!

Last week Sam was a speaker at the TEDx One Step Beyond event at the National Space Centre in Leicester – you’ll hear more of that in a week or two. However, in exploring some imagery for the event we looked for the Great Wall of China within Sentinel-2 imagery. And guess what? We found it! In the image at the top, the Great Wall can be seen cutting down the centre from the top left.

Screenshot of SNAP showing area north of Beijing, China. Data acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

It was difficult to spot. The first challenge was getting a cloud-free image of northern China, and we only found one covering our area of interest north of Beijing! Despite Sentinel-2 having 10 m spatial resolution for its visible wavelengths, as noted above, the wall is generally narrower. This means it is difficult to see the actual wall itself, but it is possible to see its path on the image. This ability to see very small things from space by their influence on their surroundings is similar to how we are able to spot microscopic phytoplankton blooms. The image on the right is a screenshot from the Sentinel Application Platform (SNAP) tool, which shows the original Sentinel-2 image of China on the top left and the zoomed section identifying the wall.

So whilst the Great Wall of China might not be visible from space with the naked eye, it is visible from our artificial eyes in the skies, like Sentinel-2.

Brexit: Science & Space

Artist's rendition of a satellite - paulfleet/123RF Stock Photo


Brexit currently dominates UK politics. Whilst it’s clear the UK is leaving the European Union (EU) in March 2019, the practical impact, and consequences, are still a confused fog hanging over everything. The UK Government Department for Exiting the European Union has been issuing position papers to set out how it sees the UK’s future arrangements with the EU.

Last week, the ‘Collaboration in science and innovation: a future partnership paper’ was issued. Given our company’s focus we were eager to see what was planned. Unfortunately, like a lot of the UK Government pronouncements on Brexit, it is high on rhetoric, but low on any helpful, or new, information or clarity.

It begins with a positive, but perhaps rather obvious, statement: one of the UK’s core objectives is to ‘seek agreement to continue to collaborate with European partners on major science, research and technology initiatives.’

Future Partnership with EU Principles
Key aspects of the UK’s ambition for the future partnership include:

  • Science & Innovation collaboration is not only maintained, but strengthened.
  • With its strong research community, the UK wants an ambitious agreement for continued research co-operation.
  • Government wants the UK to be a hub for international talent in research, and to welcome the brightest and best people from around the world.

The principles are followed by four particular areas the UK wants to discuss with the EU. Interestingly, it specifically outlines how non-EU countries currently participate in each of these areas, which are Research & Innovation Framework Programmes, Space Programmes, Nuclear R&D and Defence R&D.

Research & Innovation Framework Programmes
Horizon 2020 is highlighted as the UK ranks top across the EU in terms of contracts and participants in it. The Government confirms its commitment to underwriting any projects submitted whilst the UK is still an EU member.

Support for this programme is good; however, with an end date of 2020, it is going to be equally important to be a strong partner in whatever research funding programme follows.

Space Programmes
As we have described before, the European Space Agency (ESA) is not an EU institution, and so is not impacted by Brexit – a fact reinforced by the paper. Three key EU-led, rather than ESA-led, space programmes are highlighted:

  • Galileo Navigation and Positioning System – Issues here surround both the use of the system and its ongoing development. UK firms have been key suppliers for this work including Surrey Satellite Technology Ltd (SSTL), Qinetiq, CGI, Airbus and Scisys.
  • Copernicus – The Copernicus Earth Observation data is freely available to anyone in the world. The key element here is about being at the table to influence the direction. Although, the paper does refer to existing precedents for third party participation.
  • Space Surveillance and Tracking – this is a new programme.

The paper states that given the unique nature of space programmes, the ‘EU and UK should discuss all options for future cooperation including new arrangements.’

What Is Not Said
There are a lot of positive and welcome words here, but also a huge amount unsaid, for example:

  • Interconnectivity: Science and innovation happens when researchers work together, so the UK’s approach to the movement of people is fundamental. Will the brightest and best be allowed to come and work here, and will they want to?
  • Education: Education is fundamental to this area, yet it does not merit a single mention in the paper. New researchers and early career scientists benefit hugely from programmes such as Erasmus, will our involvement in these continue?
  • Financial Contribution: How much is the UK willing to pay to be part of science and innovation programmes? The paper notes any financial contribution will have to be weighed against other spending priorities. Not exactly hugely encouraging.
  • Contractual Issues: Part of the issue with Galileo is that the contracts specifically exclude non-EU countries from involvement. Whilst it is possible to see that the UK could negotiate use of Galileo, continued involvement as a supplier may be more difficult.

Conclusion
The UK wants dialogue with the EU on a far-reaching science and innovation agreement. This ambition is to be applauded, but we are a very long way away from that point. We hope both parties are able to work together to get there.

Flip-Sides of Soil Moisture

Soil Moisture changes between 19th and 25th August around Houston, Texas due to rainfall from Hurricane Harvey. Courtesy of NASA Earth Observatory image by Joshua Stevens, using soil moisture data courtesy of JPL and the SMAP science team.

Soil moisture is an interesting measurement as it can be used to monitor two diametrically opposed conditions, namely floods and droughts. This was highlighted last week by maps produced from satellite data for the USA and Italy respectively. These caught our attention because soil moisture gets discussed on a daily basis in the office, due to its involvement in a project we’re working on in Uganda.

Soil moisture can have a variety of meanings depending on the context. For this blog we’re using soil moisture to describe the amount of water held in the spaces between soil particles in the top few centimetres of the ground. Data is collected by microwave satellites, which measure microwaves reflected or emitted by the Earth’s surface. The intensity of the signal depends on the amount of water in the soil, enabling a soil moisture content to be calculated.
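As an illustrative sketch – not the actual SMAP or SMOS retrieval algorithm – one common, simple approach scales each microwave observation between historical ‘dry’ and ‘wet’ reference values for that location, giving a relative soil moisture index:

```python
import numpy as np

def soil_moisture_index(signal, signal_dry, signal_wet):
    """Relative soil moisture: 0 at the dry reference, 1 at the wet reference.

    signal     : observed microwave signal per pixel (e.g. backscatter in dB)
    signal_dry : historical value under the driest conditions
    signal_wet : historical value under the wettest conditions
    """
    index = (signal - signal_dry) / (signal_wet - signal_dry)
    return np.clip(index, 0.0, 1.0)  # keep within the physical 0-1 range

# Hypothetical pixel at -14 dB, between references of -18 dB (dry) and -8 dB (wet)
print(soil_moisture_index(np.array([-14.0]), -18.0, -8.0))  # → [0.4]
```

Operational products are considerably more involved (accounting for vegetation, surface roughness and temperature), but this captures the core idea that a wetter soil shifts the measured signal towards the ‘wet’ end of its local range.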

Floods
You can’t have failed to notice the devastating floods that have occurred recently in South Asia – particularly India, Nepal and Bangladesh – and in the USA. The South Asia floods were caused by monsoon rains, whilst the floods in Texas emanated from Hurricane Harvey.

Soil moisture measurements can be used to show the change in soil saturation. The map at the top of this blog, produced by NASA Earth Observatory, shows the change in soil moisture between the 19th and 25th August around Houston, Texas. The data is based on measurements acquired by the Soil Moisture Active Passive (SMAP) satellite, which uses a radiometer to measure soil moisture in the top 5 centimetres of the ground with a spatial resolution of around 9 km. On the map itself, the size of each of the hexagons shows how much the level of soil moisture changed and the colour represents how saturated the soil is.

These readings have identified that soil moisture levels got as high as 60% in the immediate aftermath of the rainfall, partly due to the ferocity of the rain, which prevented the water from seeping down into the soil and so it instead remained at the surface.

Soil moisture in Italy during early August 2017. The data were compiled by ESA’s Soil Moisture CCI project. Data courtesy of ESA. Copyright: C3S/ECMWF/TU Wien/VanderSat/EODC/AWST/Soil Moisture CCI

Droughts
By contrast, Italy has been suffering a summer of drought and hot days. This year parts of the country have not seen rain for months and the temperature has regularly topped one hundred degrees Fahrenheit – Rome, which has seen seventy percent less rainfall than normal, is planning to reduce water pressure at night for conservation efforts.

This has obviously had an impact on the ground, and again a soil moisture map has been produced which demonstrates this. This time the data came from ESA’s Soil Moisture Climate Change Initiative (CCI) project, using soil moisture data from a variety of satellite instruments. The dataset was developed by the Vienna University of Technology with the Dutch company VanderSat B.V.

The map shows the soil moisture levels in Italy from the early part of last month, with the more red the areas, the lower the soil moisture content.

Summary
Soil moisture is a fascinating measurement that can provide insights into ground conditions whether the rain is falling a little or a lot.

It plays an important role in the development of weather patterns and the production of precipitation, and is crucial to understanding both the water and carbon cycles that impact our weather and climate.

Optical Imagery is Eclipsed!

Solar eclipse across the USA captured by Suomi NPP VIIRS satellite on 21st August. Image courtesy of NASA/ NASA’s Earth Observatory.

Last week’s eclipse gave an excellent demonstration of the sun’s role in optical remote sensing. The image to the left was acquired on the 21st August by the Visible Infrared Imaging Radiometer Suite (VIIRS) aboard the NOAA/NASA Suomi NPP satellite, and the moon’s shadow can be clearly seen in the centre of the image.

Optical remote sensing images are the type most familiar to people as they use the visible spectrum and essentially show the world in a similar way to how the human eye sees it. The system works by a sensor aboard the satellite detecting sunlight reflected off the land or water – this process of light being scattered back towards the sensor by an object is known as reflectance.

Optical instruments collect data across a variety of spectral wavebands including those beyond human vision. However, the most common form of optical image is what is known as a pseudo true-colour composite which combines the red, green and blue wavelengths to produce an image which effectively matches human vision; i.e., in these images vegetation tends to be green, water blue and buildings grey. These are also referred to as RGB images.

These images are often enhanced by adjustments to the colour palettes of each of the individual wavelengths that allow the colours to stand out more, so the vegetation is greener and the ocean bluer than in the original data captured by the satellite. The VIIRS image above is an enhanced pseudo true-colour composite, and the difference between the land and the ocean is clearly visible, as are the white clouds.
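As a minimal sketch of how such a composite can be produced – assuming the red, green and blue bands have already been read into NumPy arrays (the random data below is purely illustrative) – the bands can be contrast-stretched and stacked:

```python
import numpy as np

def stretch(band, low_pct=2, high_pct=98):
    """Linear contrast stretch between the given percentiles."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

def true_colour_composite(red, green, blue):
    """Stack stretched bands into an RGB image with values in 0-1."""
    return np.dstack([stretch(red), stretch(green), stretch(blue)])

# Hypothetical 100x100 bands of raw sensor counts
rng = np.random.default_rng(0)
red, green, blue = (rng.integers(0, 4000, (100, 100)) for _ in range(3))
rgb = true_colour_composite(red, green, blue)
print(rgb.shape)  # → (100, 100, 3)
```

The percentile stretch is what makes vegetation look greener and the ocean bluer than in the raw data: it discards the extreme 2% of values at each end and spreads the rest across the full display range.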

As we noted above, optical remote sensing works by taking the sunlight reflected from the land and water. Therefore during the eclipse the moon’s shadow means no sunlight reaches the Earth beneath, causing the circle of no reflectance (black) in the centre of the USA. This is also the reason why no optical imagery is produced at night.

This also explains why the nemesis of optical imagery is clouds! In cloudy conditions, the sunlight is reflected back to the sensor by the clouds and does not reach the land or water. In this case the satellite images simply show swirls of white!

Mosaic composite image of solar eclipse over the USA on the 21st August 2017 acquired by MODIS. Image courtesy of NASA Earth Observatory images by Joshua Stevens and Jesse Allen, using MODIS data from the Land Atmosphere Near real-time Capability for EOS (LANCE) and EOSDIS/Rapid Response

A second eclipse image was produced from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra satellite. Shown on the left, this is a mosaic of images from the 21st August, where:

  • The right third of the image shows the eastern United States at about 12:10 p.m. Eastern Time, before the eclipse had begun.
  • The middle part was captured at about 12:50 p.m. Central Time during the eclipse.
  • The left third of the image was collected at about 12:30 p.m. Pacific Time, after the eclipse had ended.

Again, the moon’s shadow is obvious from the black area on the image.

Hopefully, this gives you a bit of an insight into how optical imagery works and why you can’t get optical images at night, under cloudy conditions or during an eclipse!

Landsat Turns 45!

False colour image of Dallas, Texas. The first fully operational Landsat image taken on July 25, 1972, Image courtesy: NASA’s Earth Observatory

Landsat has celebrated forty-five years of Earth observation this week. The first Landsat mission was Earth Resources Technology Satellite 1 (ERTS-1), which was launched into a sun-synchronous near-polar orbit on 23 July 1972. It wasn’t renamed Landsat-1 until 1975. It had an anticipated life of one year and carried two instruments: the Multi Spectral Scanner (MSS) and the Return-Beam Vidicon (RBV).

The Landsat missions have data continuity at their heart, which has given a forty-five year archive of Earth observation imagery. However, as technological capabilities have developed the instruments on consecutive missions have improved. To demonstrate and celebrate this, NASA has produced a great video showing the changing coastal wetlands in Atchafalaya Bay, Louisiana, through the eyes of the different Landsat missions.

In total there have been eight Landsat missions, although Landsat 6 failed to reach its designated orbit and never collected any data. The missions have been:

  • Landsat 1 launched on 23 July 1972.
  • Landsat 2 launched on 22 January 1975.
  • Landsat 3 launched on 5 March 1978.
  • Landsat 4 launched on 16 July 1982.
  • Landsat 5 launched on 1 March 1984.
  • Landsat 7 launched on 15 April 1999, and is still active.
  • Landsat 8 launched on 11 February 2013, and is still active.

Landsat 9 is planned to be launched at the end of 2020, and Landsat 10 is already being discussed.

Some of the key successes of the Landsat mission include:

  • Over 7 million scenes of the Earth’s surface.
  • Over 22 million scenes have been downloaded through the USGS-EROS website since 2008, when the data was made free-to-access, with the rate continuing to increase (Campbell 2015).
  • The economic value of just one year of Landsat data far exceeds the multi-year total cost of building, launching, and managing Landsat satellites and sensors.
  • Landsat 5 officially set a new Guinness World Records title for the ‘Longest-operating Earth observation satellite’ with its 28 years and 10 months of operation when it was decommissioned in December 2012.
  • ESA provides Landsat data downlinked via their own data receiving stations; the ESA dataset includes data collected over the open ocean, whereas USGS does not, and the data is processed using ESA’s own processor.

The journey hasn’t always been smooth. Although established by NASA, Landsat was transferred to the private sector under the management of NOAA in the early 1980s, before returning to US Government control in 1992. There have also been technical issues: the failure of Landsat 6 described above, and Landsat 7 suffering a Scan Line Corrector failure on 31st May 2003, which means that instead of mapping in straight lines, a zigzag ground track is followed. This causes parts of the edge of the image not to be mapped, giving a black stripe effect within these images; although, as the centre of the images is unaffected, the data overall can still be used.

Landsat was certainly a game changer in the remote sensing and Earth observation industries, both in terms of the data continuity approach and the decision to make the data free to access. It has provided an unrivalled archive of the changing planet which has been invaluable to scientists, researchers, book-writers and businesses like Pixalytics.

We salute Landsat and wish it many more years!

AgriTech Seeds Start to Grow in Cornwall

On Monday I attended the Jump Start AgriTech event hosted by the South West Centre of Excellence in Satellite Applications at the Tremough Innovation Centre on the University of Exeter’s Penryn campus near Falmouth in Cornwall. As the name suggests the one day event covered innovations in AgriTech with a particular focus on what is, or could be, happening in the South West.

The day began with a series of short presentations and Paul Harris, Rothamsted Research, was up first on their Open Access Farm Platform. North Wyke Farm in Devon has been equipped with a variety of sensors and instruments to understand the effects of different farming practices. Of particular interest to me was their analysis of run-off, weather monitoring and soil moisture every 15 minutes; this is a great resource for satellite product validation.

I was up next talking about Earth Observation (EO) Satellite Data for AgriTech. Having seen people overpromise and oversell EO data too many times, I began with getting people to think about what they were trying to achieve, before looking at the technology. The circle of starting questions, on the right, is how I begin with potential clients. If satellite EO is the right technology from these answers, then you can start considering the combinations of both optical/microwave data and free-to-access and commercial data. I went on to show the different types of satellite imagery and what the difference in spatial resolution looks like within an agriculture setting.

I was followed by Vladimir Stolikovic, Satellite Applications Catapult, who focused on the Internet of Things and how it’s important to have sensor network data collected and communicated, with satellite broadband being used in conjunction with mobile phones and WiFi coverage.

Our last talk was by Dr Karen Anderson, University of Exeter, who looked at how drones can capture more than imagery. I was particularly intrigued by the ‘structure from motion photogrammetry’ technique, which allows heights to be determined from multiple images; such that, for a much lower cost, you can create something similar to what is acquired from a lidar or laser scanning instrument. Also, by focusing on extracting height, data can be collected in conditions where there are variable amounts of light, such as under clouds, and it doesn’t require high-accuracy radiometric calibration.

After coffee, case studies were presented on farming applications:

  • VirtualVet – Collecting data on animal health and drug use digitally, via mobile apps, so paper records don’t become out of date and data can be collated to gain greater insights.
  • Steve Chapman, SC Nutrition Ltd, talked about improving milk production by making sure dried food is optimally prepared – large pieces of dried sweetcorn are digested less well, and a lower nutritional value is extracted from them.
  • The delightfully named Farm Crap App, from FoAM Kernow, aims to encourage farmers to spread manure rather than use artificial fertilizer. Farmers tend to go for the latter as it is easier to calculate the effects, and so having advice, regulations and the important calculations in a phone app, rather than in paper tables, should help them use manure.
  • Caterina Santachiara, ABACO, described their siti4FARMER solution, a cloud-computing based platform that includes data which scales from the field to farm and large land areas, with individual customisation so that users can easily see what they need to know.
  • Finally, Glyn Jones from AVANTI, talked about how farmers can stay connected to the internet, and tech support, while out in their fields. This sounds straightforward, but none of the current technologies work well enough – mainly due to the fact that fields aren’t flat! So a new technological area of investigation is ‘white space’ – these are frequencies allocated to broadcasting services, but left unused in particular geographical locations as buffers. The availability varies from location to location, but it is available to lower-powered devices.

After lunch, there were some presentations on Agritech funding opportunities from Innovate UK, AgriTech Cornwall and the South West Centre of Excellence in Satellite Applications. The day concluded with a facilitated session where small groups explored a variety of different ideas in more detail.

It was a really good day, and shows that there is real potential for AgriTech to grow in the South West.

Locusts & Monkeys

Soil moisture data from the SMOS satellite and the MODIS instrument acquired between July and October 2016 were used by isardSAT and CIRAD to create this map showing areas with favourable locust swarming conditions (in red) during the November 2016 outbreak. Data courtesy of ESA. Copyright : CIRAD, SMELLS consortium.

Spatial resolution is a key characteristic in remote sensing, as we’ve previously discussed. Often the view is that you need an object to be significantly larger than the resolution to be able to see it on an image. However, this is not always the case as often satellites can identify indicators of objects that are much smaller.

We’ve previously written about satellites identifying phytoplankton in algal blooms, and recently two interesting reports have described how satellites are being used to determine the presence of locusts and monkeys!

Locusts

Desert locusts are a type of grasshopper, and whilst individually they are harmless, as a swarm they can cause huge damage to the populations in their paths. Between 2003 and 2005 a swarm in West Africa affected eight million people, with reported losses of 100% for cereals, 90% for legumes and 85% for pasture.

Swarms occur when certain conditions are present; namely a drought, followed by rain and vegetation growth. ESA and the UN Food and Agriculture Organization (FAO) have been working together to determine if data from the Soil Moisture and Ocean Salinity (SMOS) satellite can be used to forecast these conditions. SMOS carries the Microwave Imaging Radiometer with Aperture Synthesis (MIRAS) instrument – a 2D interferometric L-band radiometer with 69 antenna receivers distributed on a Y-shaped deployable antenna array. It observes the ‘brightness temperature’ of the Earth, which indicates the radiation emitted from the planet’s surface. It has a temporal resolution of three days and a spatial resolution of around 50 km.

By combining the SMOS soil moisture observations with data from NASA’s MODIS instrument, the team were able to downscale SMOS to 1km spatial resolution and then use this data to create maps. This approach then predicted favourable locust swarming conditions approximately 70 days ahead of the November 2016 outbreak in Mauritania, giving the potential for an early warning system.
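The SMELLS consortium’s downscaling method is more sophisticated than we can cover here, but as a simple illustrative sketch, one way to disaggregate a coarse soil moisture pixel is to redistribute its value across finer pixels using a weighting layer (here a purely hypothetical one standing in for MODIS-derived data), while preserving the coarse-pixel mean:

```python
import numpy as np

def downscale(coarse_value, weights):
    """Spread one coarse soil moisture value over fine-resolution pixels.

    weights : fine-resolution array (e.g. derived from MODIS vegetation data)
              indicating where moisture is expected to be relatively higher
              or lower. The result preserves the coarse observation's mean.
    """
    return coarse_value * weights / weights.mean()

# Hypothetical weighting layer for the fine pixels inside one ~50 km SMOS pixel
weights = np.array([[0.8, 1.0, 1.2, 1.0, 0.9],
                    [1.1, 1.0, 0.9, 1.0, 1.1]])
fine = downscale(0.25, weights)
print(round(float(fine.mean()), 2))  # → 0.25
```

Dividing by the mean weight ensures the fine-resolution map stays consistent with the original SMOS observation while adding the spatial detail needed for 1 km mapping.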

This is interesting for us as we’re currently using soil moisture data in a project to provide an early warning system for droughts and floods.

Monkeys

Earlier this month the paper, ‘Connecting Earth Observation to High-Throughput Biodiversity Data’, was published in the journal Nature Ecology and Evolution. It describes the work of scientists from the Universities of Leicester and East Anglia who have used satellite data to help identify monkey populations that have declined through hunting.

The team have used a variety of technologies and techniques to pull together indicators of monkey distribution, including:

  • Earth observation data to map roads and human settlements.
  • Automated recordings of animal sounds to determine what species are in the area.
  • Mosquitos have been caught and analysed to determine what they have been feeding on.

Combining these various datasets provides a huge amount of information, and can be used to identify areas where monkey populations are vulnerable.

These projects demonstrate an interesting capability of satellites, which is not always recognised and understood. By using satellites to monitor certain aspects of the planet, the data can be used to infer things happening on a much smaller scale than individual pixels.

World Oceans Day

Phytoplankton Bloom off South West England. Acquired by MODIS on 12th June 2003. Data courtesy of NASA.

June 8th is World Oceans Day. This is an annual global celebration of the oceans, their importance and how they can be protected for the future.

The idea of a World Ocean Day was originally proposed by the Canadian Government at the Earth Summit in Rio in 1992. In December 2008 a resolution was passed by the United Nations General Assembly which officially declared that June 8th would be World Oceans Day. The annual celebration is co-ordinated by the Ocean Project organisation, and is going from strength to strength, with over 100 countries having participated last year.

There is a different theme each year and for 2017 it’s “Our Oceans, Our Future”, with a focus on preventing plastic pollution of the ocean and cleaning marine litter.

Why Are The Oceans Important?

  • The oceans cover over 71% of the planet and account for 96% of the water on Earth.
  • Half of all the oxygen in the atmosphere is released by phytoplankton through photosynthesis. Phytoplankton blooms are of huge interest to us at Pixalytics as despite their miniscule size, in large enough quantities, phytoplankton can be seen from space.
  • They help regulate climate by absorbing around 25% of the CO2 human activities release into the atmosphere.
  • Between 50% and 80% of all life on the planet is found in the oceans.
  • Less than 10% of the oceans have been explored by humans. More people have stood on the Moon than have visited the deepest point of the oceans – the Mariana Trench in the Pacific Ocean, at around 11 km deep.
  • Fish accounted for about 17% of the global population’s intake of animal protein in 2013.

Why Is This Year’s Theme Important?

The pollution of the oceans by plastic is something which affects us all. From bags and containers washed up on beaches to the plastic-filled garbage gyres that circulate within the Atlantic, Pacific and Indian Oceans, human activity is polluting the oceans with waste. The United Nations estimates that as many as 51 trillion particles of microplastic are in the oceans, which is a huge environmental problem.

Everyone will have seen images of dolphins, turtles or birds either eating or being trapped by plastic waste. However, recently Dr Richard Kirby – a friend of Pixalytics – was able to film a plastic microfibre being eaten by plankton. As plankton are, in turn, eaten by many marine creatures, this is one example of how waste plastic is entering the food chain. The video can be seen here in a BBC report.

Dr Kirby also runs the Secchi Disk project which is a citizen science project to study phytoplankton across the globe and receives data from every ocean.

Get Involved With World Oceans Day

The world’s oceans are critical to the health of the planet, and to us! They help regulate climate, generate most of the oxygen we breathe, and provide a variety of food and sources of medicines. So everyone should want to help protect and conserve these natural environments. There are a number of ways you can get involved:

  • Participate: There are events planned all across the world. You can have a look here and see if any are close to you.
  • Look: The Ocean Project website has a fantastic set of resources available.
  • Think: Can you reduce your use, or reliance on plastic?
  • Promote: Talk about World Oceans Day, Oceans and their importance.

Great Barrier Reef Coral Bleaching

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

Coral bleaching on the Great Barrier Reef in Australia was worse than expected last year, and a further decline is expected in 2017 according to the Great Barrier Reef Marine Park Authority. In a document issued this week they noted that, along with reefs across the world, the Great Barrier Reef has had widespread coral decline and habitat loss over the last two years.

We’ve written about coral bleaching before, as it’s a real barometer of climate change. To put the importance of the Great Barrier Reef into context:

  • It’s 2300 km long and covers an area of around 70 million football pitches;
  • Consists of 3000 coral reefs, which are made up from 650 different types of hard and soft coral; and
  • Is home to over 1500 types of fish and more than 100 varieties of sharks and rays.

Coral bleaching occurs when water stress causes coral to expel the photosynthetic algae, which give coral their colours, exposing the skeleton and turning them white. The stress is mostly due to higher seawater temperatures, although cold-water stress, run-off, pollution and high solar irradiance can also cause bleaching. Whilst bleaching does not kill coral immediately, it does put them at a greater risk of mortality from storms, poor water quality, disease and the crown-of-thorns starfish.

Last year the Great Barrier Reef suffered its worst bleaching on record: aerial and in-water surveys identified that 29% of shallow water coral reefs died in 2016, up from the original estimate of 22%. The most severe mortality was in an area to the north of Port Douglas, where 70% of the shallow water corals died. This is hugely sad news to Sam and me, as we explored this area of the Great Barrier Reef ourselves about fifteen years ago.

Whilst hugely concerning, there is also a little hope! Coral in the south of the Great Barrier Reef recovered strongly, as bleaching and other impacts were less severe there.

Images from the Copernicus Sentinel-2A satellite captured on 8 June 2016 and 23 February 2017 show coral turning bright white for Adelaide Reef, Central Great Barrier Reef. Data courtesy of Copernicus/ESA, and contains modified Copernicus Sentinel data (2016–17), processed by J. Hedley; conceptual model by C. Roelfsema

The coral bleaching event this year has also been captured by Sentinel-2. Scientists from ESA’s Sen2Coral project have used change detection techniques to determine bleaching. Images between January and April showed areas of coral turning bright white and then darkening, although it was unclear whether the darkening was due to coral recovery or dead coral being overgrown with algae. In-water surveys were undertaken, which confirmed the majority of the darkened areas were algal overgrowth.
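The idea behind this kind of change detection can be illustrated with a toy example: compare two co-registered reflectance images of the same reef and flag pixels that have brightened sharply, as whitened coral would between acquisitions. Below is a minimal sketch in Python/NumPy; the 0–1 reflectance scaling, the function name and the threshold value are illustrative assumptions, not the actual Sen2Coral processing chain.

```python
import numpy as np

def bleaching_candidates(before, after, brighten_threshold=0.15):
    """Flag pixels that brightened markedly between two co-registered
    reflectance images (values scaled 0-1). A toy stand-in for the kind
    of change detection used to spot coral turning bright white."""
    change = after - before          # positive values mean the pixel brightened
    return change > brighten_threshold

# Tiny synthetic 2x2 scene: the top-right pixel whitens between the two dates
before = np.array([[0.10, 0.12],
                   [0.11, 0.10]])
after = np.array([[0.10, 0.35],
                  [0.12, 0.11]])

mask = bleaching_candidates(before, after)
print(mask)
```

In practice a further step is needed, as the article notes: a pixel that later darkens again could be either recovering coral or dead coral overgrown by algae, which is why the in-water surveys were essential.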

This work has shown that coral bleaching can be seen from space, although it needs to be supported by in-situ work. ESA intends to develop a coral reef tool as part of the open-source Sentinel Application Platform (SNAP) toolkit. This will enable anyone to monitor the health of coral reefs worldwide and, hopefully, help protect these natural wonders.