Great Barrier Reef Coral Bleaching

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

Coral bleaching on the Great Barrier Reef in Australia was worse than expected last year, and a further decline is expected in 2017 according to the Great Barrier Reef Marine Park Authority. In a document issued this week they noted that, along with reefs across the world, the Great Barrier Reef has had widespread coral decline and habitat loss over the last two years.

We’ve written about coral bleaching before, as it’s a real barometer of climate change. To put the importance of the Great Barrier Reef into context:

  • It’s 2300 km long and covers an area of around 70 million football pitches;
  • Consists of 3000 coral reefs, which are made up of 650 different types of hard and soft coral; and
  • Is home to over 1500 types of fish and more than 100 varieties of sharks and rays.

Coral bleaching occurs when water stress causes coral to expel the photosynthetic algae, which give coral their colours, exposing the skeleton and turning them white. The stress is mostly due to higher seawater temperatures; although cold water stresses, run-off, pollution and high solar irradiance can also cause bleaching. Whilst bleaching does not kill coral immediately, it does put them at a greater risk of mortality from storms, poor water quality, disease and the crown-of-thorns starfish.

Last year the Great Barrier Reef suffered its worst bleaching on record: aerial and in-water surveys identified that 29% of shallow water coral reefs died in 2016, up from the original estimate of 22%. The most severe mortality was in an area to the north of Port Douglas, where 70% of the shallow water corals died. This is hugely sad news to Sam and me, as we explored this area of the Great Barrier Reef ourselves about fifteen years ago.

Whilst hugely concerning, there is also a little hope! There was a strong recovery of coral in the south of the Great Barrier Reef, where bleaching and other impacts were less severe.

Images from the Copernicus Sentinel-2A satellite captured on 8 June 2016 and 23 February 2017 show coral turning bright white for Adelaide Reef, Central Great Barrier Reef. Data courtesy of Copernicus/ESA, and contains modified Copernicus Sentinel data (2016–17), processed by J. Hedley; conceptual model by C. Roelfsema

The coral bleaching event this year has also been captured by Sentinel-2. Scientists from ESA’s Sen2Coral project have used change detection techniques to determine bleaching. Images between January and April showed areas of coral turning bright white and then darkening, although it was unclear whether the darkening was due to coral recovery or dead coral being overgrown with algae. In-water surveys were undertaken, which confirmed the majority of the darkened areas were algal overgrowth.
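
For readers curious about the mechanics, a minimal change-detection sketch along these lines might look as follows, assuming two co-registered single-band reflectance rasters exported from Sentinel-2 (the file names and threshold are hypothetical):

```python
# A minimal change-detection sketch, assuming two co-registered
# single-band reflectance rasters exported from Sentinel-2.
# File names and the threshold are hypothetical.
import numpy as np
import rasterio

with rasterio.open("reef_2016-06-08_red.tif") as first, \
     rasterio.open("reef_2017-02-23_red.tif") as second:
    before = first.read(1).astype(float)
    after = second.read(1).astype(float)

# Bleached coral turns bright white, so a strong increase in
# reflectance is a candidate bleaching signal; later darkening
# would show up as the reverse change.
brightened = (after - before) > 0.1 * np.nanmax(before)
print(f"{brightened.sum()} pixels brightened beyond the threshold")
```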

This work has proved that coral bleaching can be seen from space, although it needs to be supported by in-situ work. ESA intends to develop a coral reef tool, which will be part of the open-source Sentinel Application Platform (SNAP) toolkit. This will enable anyone to monitor the health of coral reefs worldwide and hopefully, help protect these natural wonders.

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS image of the Yucatán Peninsula from 25 April 2017, showing where thermal bands have picked up increased temperatures. Data courtesy of NASA; NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways in which satellites can monitor fires.

Acquired on 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument’s thermal band detected temperatures higher than normal.

False colour image of the West Mims fire on Florida/Georgia boundary acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida–Georgia border acquired by NASA’s Aqua satellite on 2 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). On the natural colour image the fires could only be seen as smoke plumes, but on the left is the false colour image, which combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself is shown as orange.
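
Building such a false-colour composite is straightforward with open tools. Here is a minimal sketch, assuming three co-registered single-band rasters (the file names are hypothetical):

```python
# A minimal false-colour composite sketch: place shortwave-infrared,
# near-infrared and green bands into the red, green and blue channels.
# File names are hypothetical; any co-registered single-band rasters work.
import numpy as np
import rasterio

def stretch(band, lo=2, hi=98):
    """Percentile stretch to the 0-1 range so the composite is viewable."""
    p_lo, p_hi = np.nanpercentile(band, (lo, hi))
    return np.clip((band - p_lo) / (p_hi - p_lo), 0, 1)

channels = []
for path in ("swir.tif", "nir.tif", "green.tif"):
    with rasterio.open(path) as src:
        channels.append(stretch(src.read(1).astype(float)))

rgb = np.dstack(channels)  # burnt areas appear brown, active fire orange
```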

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographic information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; the MODIS fire products appear to be the source of the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for undertaking this practice.
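
As an illustration of how such detections might be screened for a region of interest, here is a hedged sketch assuming a CSV in the NASA FIRMS active-fire export format; the file name and bounding box are hypothetical:

```python
# A sketch of screening active-fire detections for a region of interest,
# assuming a CSV in the NASA FIRMS export format (latitude, longitude and
# confidence columns); the file name and bounding box are hypothetical.
import pandas as pd

fires = pd.read_csv("modis_active_fires_april.csv")

# Keep high-confidence detections inside a Punjab-like bounding box;
# matching against field GPS records would be the next step.
roi = fires[
    fires["latitude"].between(29.5, 32.5)
    & fires["longitude"].between(73.8, 76.9)
    & (fires["confidence"] >= 80)
]
print(f"{len(roi)} candidate field fires for follow-up")
```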

Imaged by Sentinel-2, burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest that was burnt last year in the Republic of Congo in Africa – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with the Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, although the temporal resolution will improve to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36 000 hectares of forest were burnt in 2016.
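
The conversion from a 10 m classification to an area estimate is simple arithmetic: each 10 m × 10 m pixel covers 100 m², or 0.01 ha. A toy sketch, assuming a binary burnt/unburnt mask (the mask file is hypothetical):

```python
# Toy arithmetic for the area estimate: each 10 m x 10 m pixel covers
# 100 m^2, i.e. 0.01 ha. The mask file is hypothetical.
import numpy as np

burnt_mask = np.load("burnt_mask.npy")  # 1 = burnt, 0 = unburnt
pixel_area_ha = (10 * 10) / 10_000      # 0.01 ha per 10 m pixel

burnt_ha = burnt_mask.sum() * pixel_area_ha
print(f"Estimated burnt area: {burnt_ha:,.0f} ha")
```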

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses, potentially helping to save lives. This is another example of the societal benefit of satellite remote sensing.

Remote Sensing Goes Cold

Average thickness of Arctic sea ice in spring as measured by CryoSat between 2010 and 2015. Image courtesy of ESA/CPOM

Remote sensing over the Polar Regions has poked its head above the ice recently.

On the 8th February The Cryosphere, a journal of the European Geosciences Union, published a paper by Smith et al. titled ’Connected subglacial lake drainage beneath Thwaites Glacier, West Antarctica’. It described how researchers used data from ESA’s CryoSat-2 satellite to look at lakes beneath a glacier.

This work is interesting from a remote sensing viewpoint as it is a repurposing of CryoSat-2’s mission. Its main purpose is to measure the thickness of the ice sheets and marine ice cover using its Synthetic Aperture Radar (SAR)/Interferometric Radar Altimeter, known as SIRAL, and it can detect millimetre changes in the elevation of both ice sheets and sea ice.

The team were able to use this data to determine that the ice of the glacier had subsided by several metres as water drained away from four lakes underneath. Whilst the whole process took place between June 2012 and January 2014, the majority of the drainage happened in a six-month period. During this time it’s estimated that peak drainage was around 240 cubic metres per second, which is four times faster than the outflow of the River Thames into the North Sea.
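
To give a feel for the arithmetic involved, here is a back-of-envelope sketch of turning surface-lowering measurements into a mean discharge; the file name, grid spacing and period are illustrative assumptions, not the paper’s values:

```python
# Back-of-envelope sketch: turn a grid of elevation change into a drained
# volume and mean discharge. The file name, grid spacing and period are
# illustrative assumptions, not the paper's values.
import numpy as np

dz = np.load("elevation_change_m.npy")  # CryoSat-2 elevation change, metres
cell_area = 500.0 * 500.0               # m^2 per grid cell (assumed spacing)

drained_volume = np.nansum(np.where(dz < 0, -dz, 0.0)) * cell_area  # m^3
period_s = 6 * 30 * 24 * 3600           # roughly six months, in seconds
print(f"Mean discharge: {drained_volume / period_s:.0f} m^3/s")
```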

We’ve previously highlighted that repurposing data – using data for more purposes than originally intended – is going to be one of the key future innovation trends for Earth Observation.

Last week, ESA also described how Sentinel-1 and Sentinel-2 data have been used over the last five months to monitor a crack in the ice near the Halley VI research base of the British Antarctic Survey (BAS). The crack, known as the Halloween Crack, is located on the Brunt Ice Shelf in the Weddell Sea sector of Antarctica and was identified last October. The crack grew around 600 m per day during November and December, although it has since slowed to only one third of that daily growth.

Since last November Sentinel-2 has been acquiring optical images at each overflight, and this has been combined with SAR data from the two Sentinel-1 satellites. This SAR data will be critical during the Antarctic winter when there are only a few hours of daylight and a couple of weeks around mid-June when the sun does not rise.

This work hit the headlines as BAS decided to evacuate their base for the winter, due to the potential threat. The Halley VI base, which was only 17 km from the crack, is the first Antarctic research station specifically designed to allow relocation to cope with this sort of movement in the ice shelf. It was already planned to move the base 23 km further inland, and this was successfully completed on the 2nd February. Further movement will depend on how the Halloween Crack develops over the winter.

Finally, the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) project was announced this week at the annual meeting of the American Association for the Advancement of Science. Professor Markus Rex outlined the project, which will sail a research vessel into the Arctic sea ice and let it get stuck so it can drift across the North Pole. The vessel will carry a variety of remote sensing and in-situ instruments, and will aim to collect data on how the climate is changing in this part of the world by measuring the atmosphere-ice-ocean system.

These projects show that the Polar Regions have a lot of interest, and variety, for remote sensing.

Supporting Chimpanzee Conservation from Space

Gombe National Park, Tanzania. Acquired by Sentinel-2 in December 2016. Image courtesy of ESA.

Being able to visualise the changing face of the planet over time is one of the greatest strengths of satellite remote sensing. Our previous blog showed how Dubai’s coastline has evolved over a decade, and last week NASA described interesting work they’re doing on monitoring habitat loss for chimpanzees in conjunction with the Jane Goodall Institute.

Jane Goodall has spent over fifty years working to protect and conserve chimpanzees from the Gombe National Park in Tanzania, and formed the Jane Goodall Institute in 1977. The Institute works with local communities to provide sustainable conservation programmes.

A hundred years ago more than one million chimpanzees lived in Africa; today the World Wildlife Fund estimates the population may only be around 150,000 to 250,000. The decline is stark. For example, the Ivory Coast populations have declined by 90% within the last twenty years.

One of the key factors contributing to this decline is habitat loss, mostly through deforestation, although other factors such as hunting, disease and illegal capture also contribute.

Forests cover around 31% of the planet, and deforestation occurs when trees are removed and the land is converted to another use. In chimpanzee habitats, the deforestation is mostly due to logging, mining and drilling for oil. This change in land use can be monitored from space using remote sensing. Satellites produce regular images which can be used to monitor changes in the natural environment, in turn giving valuable information to conservation charities and other organisations.
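
A common way to flag such change is to difference a vegetation index between two dates. A minimal sketch, assuming red and near-infrared band rasters for two co-registered acquisitions (file names and threshold are hypothetical):

```python
# A minimal deforestation-flagging sketch: compute NDVI for two dates and
# flag pixels with a strong vegetation decline. File names and the
# threshold are hypothetical; red and near-infrared bands from Landsat
# or Sentinel-2 both work.
import rasterio

def ndvi(red_path, nir_path):
    with rasterio.open(red_path) as r, rasterio.open(nir_path) as n:
        red = r.read(1).astype(float)
        nir = n.read(1).astype(float)
    return (nir - red) / (nir + red + 1e-9)

change = ndvi("red_1999.tif", "nir_1999.tif") - ndvi("red_1972.tif", "nir_1972.tif")
deforested = change < -0.3  # crude threshold; calibrate against ground truth
print(f"{deforested.mean():.1%} of pixels show a strong vegetation decline")
```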

In 2000 Lilian Pintea, from the Jane Goodall Institute, was shown Landsat images comparing the area around the Gombe National Park in 1972 and 1999. The latter image showed huge deforestation outside the park’s boundary. The Institute have continued to use Landsat imagery to monitor what is happening around the National Park. In 2009 they began a citizen science project with local communities, giving them smartphones to report their observations. Combining these with ongoing satellite data from NASA has helped develop and implement local plans for land use and protection of the forests. Further visualisation of this work can be found here. The image at the top was acquired by Sentinel-2 in December 2016 and shows the Gombe National Park, although it is under a little haze.

The satellite data supplied by NASA comes from the Landsat missions, which currently have an archive of almost forty-five years of data that is freely available to anyone. We also used Landsat data for our Dubai animation last week. Landsat captures optical data, which means it operates in a similar manner to the human eye, although the instruments also have infrared capabilities. However, one drawback of optical instruments is that they cannot see through clouds. Therefore, whilst Landsat is great for monitoring land use when there are clear skies, it can be combined with synthetic aperture radar (SAR), which operates in the microwave spectrum and can see through both clouds and smoke. This combination enables land use and land change to be monitored anywhere in the world. Using the freely available Landsat and Sentinel-1 SAR data, you could monitor what is happening to the forests in your own neighbourhood.

Satellite data is a powerful tool for monitoring changes in the environment, and the archive of data available offers a unique opportunity to see what has happened over the last four decades.

Remote Sensing: Learning, Learned & Rewritten

Image of Yemen acquired by Sentinel-2 in August 2015. Data courtesy of ESA.

This blog post is about what I did, and the thoughts that came to mind, during my three-month ERASMUS+ internship at Pixalytics, which began in July and ends this week.

During my first week at Pixalytics, after being introduced to the Plymouth Science Park buildings and the office, my first task was to get a basic understanding of what remote sensing is actually about. With the help of Sam and Andy’s book, Practical Handbook of Remote Sensing, that was pretty straightforward.

As the words suggest, remote sensing is the acquisition of data and information about an object without needing to be on site. It is then possible to perform a variety of analyses and processing on this data to better understand and study the physical, chemical and biological phenomena that affect the environment.

Examples of programming languages: C, Python & IDL

I soon realized that quite a lot of programming was involved in the analysis of satellite data. From my point of view, though, some of the scripts, written in IDL (Interactive Data Language), were not as fast and efficient as they could be. With that in mind, I decided to rewrite one of the scripts, turning it into a C program. This allowed me to get a deeper understanding of satellite dataset formats (e.g. HDF, Hierarchical Data Format) and improve my overall knowledge of remote sensing.

While IDL, a long-established scientific language for remote sensing, provides a quick way of writing code, it has a number of glaring downsides. Poor memory management and a complete lack of strictness often lead to scripts that break easily. It’s also quite easy to write not-so-pretty and confusing spaghetti code, i.e. twisted and tangled code.

Writing C code, on the other hand, can get overly complicated and tedious for some tasks that would require just a few lines in IDL. While it gives the programmer almost full control of what’s going on, sometimes it’s just not worth the time and effort.

In the end, I chose to rewrite the scripts in Python, which I found to be quite a good compromise. Indentation can sometimes be a bit annoying, and coming from other languages the syntax might seem unusual, but Python’s great community and the wide availability of modules that achieve your goals in just a few lines really make up for it.
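
To illustrate the compromise, reading a dataset out of an HDF file takes only a few lines in Python. This sketch uses h5py and assumes an HDF5 file with a hypothetical dataset path; older HDF4 products need the pyhdf package instead:

```python
# Reading a dataset from an HDF file takes only a few lines in Python.
# This sketch uses h5py and assumes an HDF5 file with a hypothetical
# dataset path; older HDF4 products need the pyhdf package instead.
import h5py

with h5py.File("scene.h5", "r") as f:
    print(list(f.keys()))                    # inspect the top-level groups
    data = f["geophysical_data/chlor_a"][:]  # read a dataset as a NumPy array

print(data.shape, data.dtype)
```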

It was soon time to switch to a bigger and more complex task, which has been, to this day, what I would call my “main task” during my time at Pixalytics: building an automated online processing website. The website aspect was relatively easy with a combination of the usual HTML, Javascript, PHP and CSS; it was rewriting and integrating the remote sensing scripts that was difficult. Finally, all of those little, and sometimes not quite so little, scripts and programs were available from a convenient web interface, bringing much satisfaction and pride after all those hours of heavy thinking and brainstorming. Hopefully, you will read more about this development from Pixalytics, as it will form the back-end of the product suite they plan to launch in the near future.
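
The site itself used PHP, but as a language-neutral illustration of the idea, here is a minimal sketch of exposing a processing script behind a web endpoint using Python and Flask (process_scene is a hypothetical stand-in for one of the rewritten scripts):

```python
# A minimal sketch of exposing a processing script behind a web endpoint.
# process_scene is a hypothetical stand-in for a real processing script.
from flask import Flask, jsonify, request

app = Flask(__name__)

def process_scene(scene_id: str) -> dict:
    # placeholder: call the real remote sensing processing here
    return {"scene": scene_id, "status": "processed"}

@app.route("/process")
def process():
    scene_id = request.args.get("scene", "")
    return jsonify(process_scene(scene_id))

if __name__ == "__main__":
    app.run(port=8080)
```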

During my internship there was also time for events inside the Science Park, such as the Hog Roast, and events outside as well, such as the South-West England QGIS User Group meeting in Dartmoor National Park. While that meeting was not exactly about remote sensing, being more on the Geographic Information System (GIS) side, it made me realize how much I had learned about remote sensing in my short time at Pixalytics, and I was able to exchange opinions and points of view with other people who were keen on the subject.

A side project I worked on in my final weeks was scanning the world for stunning, interesting (and possibly both) places on Earth to make postcards from – such as the one at the top of the blog. At times, programming and scientific reading can get challenging and/or frustrating, and it’s so relaxing to just look at and enjoy the beauty of our planet.

It is something that anyone can do as it takes little knowledge about remote sensing. Free satellite imagery is available through a variety of sources; what I found to be quite easy to access and use was imagery from USGS/NASA Landsat-8 and ESA Sentinel-2. It is definitely something I would recommend.

Finally, I want to say “thank you” to Sam and Andy, without whom I would never have had the opportunity to get the most out of this experience, in a field I’ve always been interested in but had never had the chance to actually get my hands on.

Blog written by Davide Mainas on an ERASMUS+ internship with Pixalytics via the Tellus Group.

Gathering of the UK Remote Sensing Clans

The Remote Sensing & Photogrammetry Society (RSPSoc) 2016 Annual Conference is taking place this week, hosted by the University of Nottingham and the British Geological Survey. Two Pixalytics staff, Dr Sam Lavender and Dr Louisa Reynolds, left Plymouth on a cold wet day on Monday, and arrived in the Nottinghamshire sunshine as befits RSPSoc week. The conference runs for three days and gives an opportunity to hear about new developments and research within remote sensing. Both Sam and Louisa are giving presentations this year.

Tuesday morning began with the opening keynote presentation given by Stephen Coulson of the European Space Agency (ESA), which discussed their comprehensive programme including the Copernicus and Earth Explorer missions. The Copernicus missions are generating ten times more data than similar previous missions, which presents logistical, processing and storage challenges for users. The future vision is to bring the user to the data, rather than the other way around. However, the benefits of cloud computing are still to be fully understood and ESA are interested in hearing about applications that couldn’t be produced with the IT technology we had 5 years ago.

After coffee Sam chaired the commercial session titled ‘The challenges (and rewards) of converting scientific research into commercial products.’ It started with three short viewpoint presentations from Jonathan Shears (Telespazio VEGA UK), Dr Sarah Johnson (University of Leicester) and Mark Jarman (Satellite Applications Catapult), and then moved into an interactive debate. It was great to see good attendance and a lively discussion ensued. Sam is planning to produce a white paper, with colleagues, based on the session. Some of the key points included:

  • Having informative websites so people know what you do;
  • Working with enthusiastic individuals, as they will make sure something happens; and
  • Having a strong commercial business case alongside technical feasibility.

Dr Louisa Reynolds, Pixalytics Ltd, giving a presentation at RSPSoc 2016

Louisa presented on Tuesday afternoon within the Hazards and Disaster Risk Reduction session. Her presentation, ‘A semi-automated flood mapping procedure using statistical SAR backscatter analysis’, summarised the flood mapping work Pixalytics has been doing over the last year, funded by the Space for Smarter Government Programme (SSGP). Louisa was the third presenter to show Sentinel-1 flood maps of York, so it is clearly a popular topic!

Alongside Louisa’s presentation, there have been some other fascinating talks on topics as varied as:

  • Detecting and monitoring artisanal oil refining in the Niger Delta;
  • Night-time lidar reading of long-eroded gravestones;
  • Photogrammetric maps of ancient water management features in Al-Jufra, Libya;
  • Seismic risk in Crete; and
  • The activities of MapAction.

Louisa’s favourite part so far, though, was watching a video of the launch of Sentinel-1A, through the Soyuz VS07 rocket’s staging and payload deployment, filmed simultaneously from the craft and from the ground.

Just so you don’t think the whole event is about remote sensing, the conference also has a thriving social scene. On Monday there was a tour of The City Ground, legendary home of Nottingham Forest, by John McGovern, who captained Forest to successive European Cups in 1979 and 1980. It was a great event and it was fascinating to hear about the irascible leadership style of Brian Clough. Tuesday’s event was a tour round the spooky Galleries of Justice Museum.

The society’s Annual General Meeting takes place on Wednesday morning; Sam’s presentation, ‘Monitoring Land Cover Dynamics: Bringing together Landsat-8 and Sentinel-2 data’, is in the Land Use/Land Cover Mapping session which follows.

The start of RSPSoc has been great as usual, offering chances to catch up with old remote sensing friends and meet some new ones. We are looking forward to the rest of the conference and to 2017!

Spinning Python in Green Spaces

2016 map of green spaces in Plymouth, using Sentinel-2 data courtesy of Copernicus/ESA.

As students, we are forever encouraged to find work experience to develop our real-life skills and enhance our CVs. During the early part of my second year I was thinking about possible work experience for the following summer, and thanks to my university department I found the Space Placements in INdustry (SPIN) scheme. SPIN has been running for 4 years now, advertising short summer placements at host companies. These provide a basis for students with degrees involving maths, physics or computer science to get an insight into the thriving space sector. I chose to apply to Pixalytics, and three months later, in late March, my application was accepted.

Fast forward a few more months and I was on the familiar train down to Plymouth in my home county of Devon. Living in a new place never fails to confuse, but with perseverance I managed to settle in quickly. In the same way, I was able to relate knowledge from my degree (such as atmospheric physics and statistics) to the subject of remote sensing, a topic I had not previously studied. Within a few days I was at work on my own projects, learning more along the way.

My first task was an informal investigation into the open data that Plymouth City Council (PCC) has recently uploaded onto the web. PCC are looking for ways to create and support innovative business ideas that could potentially use open data, and given their background, Pixalytics could see the potential in developing this. I used PCC’s green space, nature reserve and neighbourhood open data sets and found a way to calculate areas of green space in Plymouth using Landsat/Sentinel-2 satellite data to provide a comparison.

Sentinel-2 image of Plymouth from 2016. Data courtesy of Copernicus/ESA.

There were a few challenges to overcome in using the multiple PCC data sets, as they had different coordinate reference systems, which needed to be consistent for use in GIS software. For example, the nature reserves data set was partly in WGS 84 and partly in OSGB 1936, the green space data set is in WGS 84, and the neighbourhood boundaries are in OSGB 1936. This meant that after importing these data sets into GIS software, they wouldn’t line up. Also, the green space data set didn’t include landmarks such as the disused Plymouth City Airport, and large areas around Derriford Hospital and Ernesettle. Using GIS software I then went on to find a way to classify and calculate areas of green space within the Plymouth city boundary. The Sentinel-2 image, which can be seen above, has a higher spatial resolution and allowed me to include front and back gardens.
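
Getting the layers to line up is a matter of reprojecting everything onto one coordinate reference system. A minimal sketch with GeoPandas, using hypothetical file names (OSGB 1936 / British National Grid is EPSG:27700 and WGS 84 is EPSG:4326):

```python
# A minimal sketch of forcing the layers onto one coordinate reference
# system with GeoPandas. File names are hypothetical; OSGB 1936 /
# British National Grid is EPSG:27700 and WGS 84 is EPSG:4326.
import geopandas as gpd

green_space = gpd.read_file("pcc_green_space.shp")        # WGS 84
neighbourhoods = gpd.read_file("pcc_neighbourhoods.shp")  # OSGB 1936

# Reproject both to British National Grid so they line up in the GIS,
# and so areas come out in square metres rather than degrees.
green_space = green_space.to_crs(epsg=27700)
neighbourhoods = neighbourhoods.to_crs(epsg=27700)

print(green_space.area.sum() / 1e6, "km^2 of designated green space")
```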

My green space map for 2016, created from Sentinel-2 data, is the most accurate, and gives a total area of green space within the Plymouth neighbourhood boundary of 43 square kilometres, compared with the 28 square kilometres that PCC have designated within their dataset. There are some obvious, explainable differences, but it would be interesting to explore this more deeply.

My second project was to write computer code for the processing and mosaicking of Landsat imagery. Pixalytics is developing products where the user can select an area of interest from a global map, and this can cause difficulties if the area crosses multiple images. My work was to make these images as continuous as possible, accounting for the differences in radiances.

I ended up developing a Python package, whose functions include obtaining the WRS path and row from an input latitude and longitude, correcting for the differences in radiances, and clipping and merging multiple images. There is also code that helps reduce the visual impact of clouds on individual images by using the quality band of the Landsat 8 product. This project took up most of my time; however, I don’t think readers would appreciate, let alone read, a 500-line Python script, so it has been left out.
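
That said, a hedged sketch of two of the package’s ideas, merging neighbouring scenes and masking cloudy pixels with the quality band, gives a flavour of it (file names are hypothetical, and the particular QA bit tested is an assumption that depends on the Landsat product used):

```python
# A hedged sketch of two of the package's ideas: merging neighbouring
# Landsat scenes with rasterio, and masking cloudy pixels using the
# Landsat 8 quality band. File names are hypothetical and the QA bit
# tested is an assumption that depends on the Landsat product used.
import rasterio
from rasterio.merge import merge

sources = [rasterio.open(p) for p in ("scene_a.tif", "scene_b.tif")]
mosaic, transform = merge(sources)  # clip and merge into one array

with rasterio.open("scene_a_qa.tif") as qa_src:
    qa = qa_src.read(1)
cloudy = (qa & (1 << 4)) > 0        # test a single 'cloud' bit flag
print(f"{cloudy.mean():.1%} of scene A flagged as cloud")
```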

I’d like to take this opportunity to thank Andrew and Samantha for giving me an insight into this niche, and potentially lucrative, area of science, as it has given me some direction and motivation for the last year of my degree. I hope I’ve provided some useful input to Pixalytics (even if it is just giving Samantha a very long-winded Python lesson), because they certainly have done for me!

 

Blog written by:
Miles Lemmer, SPIN Summer Placement student.
BSc. Environmental Physics, University of Reading.

Rio Olympics from space

Rio de Janeiro, Brazil, acquired on the 13th July 2016. Image courtesy of Copernicus/ESA.

The Opening Ceremony of the 2016 Summer Olympics takes place on Friday and so we’ve decided to revive our highly infrequent blog series ‘Can you see sporting venues from space?’ Previously we’ve looked for the Singapore and Abu Dhabi Formula One Grand Prix Circuits, but this week we’re focussing on the Rio Olympic venues.

Rio de Janeiro
The Games of the XXXI Olympiad will take place from the 5th to the 21st August in the Brazilian city of Rio de Janeiro. It is expected that more than ten thousand athletes will be competing for the 306 Olympic titles across 37 venues, 7 of which are temporary and 5 of which are outside Rio. The remaining 25 are permanent venues within the city, and 11 of these have been newly built for the Olympics and Paralympics. It is these permanent venues that we’ll try to spot from space!

The image at the top of the blog shows the Rio area, and you’ll notice the dark green area in the centre of the image, which is the Tijuca National Park, containing one of the world’s largest urban rainforests. It covers an area of 32 km².

Spatial Resolution
Spatial resolution is the key characteristic in whether sporting venues can be seen from space, and in simplistic terms it refers to the smallest object that can be seen on Earth from that sensor. For example, an instrument with a 10 m spatial resolution means that each pixel on its image represents 10 m, and therefore for something to be distinguishable on that image it needs to be larger than 10 m in size. There are exceptions to this rule, such as gas flares, which are so bright that they can dominate a much larger pixel.
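
As a toy illustration of that rule of thumb (purely illustrative):

```python
# A toy illustration of the rule of thumb: an object only stands out
# if it is larger than the sensor's pixel size.
def is_resolvable(object_size_m: float, pixel_size_m: float) -> bool:
    return object_size_m > pixel_size_m

print(is_resolvable(100, 10))  # a ~100 m stadium in 10 m pixels: True
print(is_resolvable(5, 10))    # a 5 m boat in 10 m pixels: False
```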

We used the phrase ‘simplistic terms’ above because, technically, the sensor in the satellite doesn’t actually see a square pixel; instead it sees an ellipse, due to the angle through which it receives the signal. The ellipses are turned into square pixels by data processing to create the image. Spatial resolution is generally considered to have four categories:

  • Low spatial resolution: tend to have pixels between 50 m and 1 km.
  • Medium spatial resolution: tend to have pixels between 4 m and 50 m.
  • High spatial resolution: tend to have pixels between 1 m and 4 m.
  • Very high spatial resolution: tend to have pixels between 0.25 m and 1 m.

Clearly, very high resolution imagery, such as that provided by the commercial WorldView satellites owned by DigitalGlobe, can provide great images of the Olympic venues. However, as you know, we like to work with data that is free to access rather than paid for. We’ve used Sentinel-2 data for this blog, which has a 10 m spatial resolution for the visible and near-infrared bands of the multispectral imager it carries.

Can we see the Olympic venues from space?
In our earlier parts of this infrequent series we couldn’t see the night race from the Singapore circuit, but we did identify the Abu Dhabi track and red roof of the Ferrari World theme park. So can we see the Olympics? Actually we can!

Image courtesy of Copernicus/ESA.

On the image to the left, you’ll notice two bright white circles, one in the middle of the image and the second to the south-east. The bright circle in the middle is the Olympic Stadium, which will host the athletics and stands out clearly from the buildings surrounding it. To the south-east is the Maracanã Stadium, which will stage the opening and closing ceremonies together with the finals of the football tournaments.

Image courtesy of Copernicus/ESA.

In the bottom left of the image is a small triangular shape, which is the location of the Aquatics Stadium, the Olympic Tennis Centre, the gymnastics and wheelchair basketball arenas, and the Carioca arenas, which will host basketball, judo, wrestling and boccia. The bottom of the triangle juts out into the Jacarepaguá Lagoon.

Image courtesy of Copernicus/ESA.

In the top left of the image, you can see the runway of the military Afonsos Air Force Base. North of the air base are a number of other Olympic venues, although these are hard to spot within their surroundings; they include the Equestrian Centre, the Hockey Centre, the BMX Centre, the whitewater canoe slalom course and the Deodoro Stadium, which will host the rugby sevens and modern pentathlon.

It is possible to see the Olympic venues from space! Good luck to all the athletes competing over the next few weeks.

Living Planet Is Really Buzzing!

Living Planet rotating globe in the exhibition area. Photo: S Lavender

This week I’m at the European Space Agency’s 2016 Living Planet Symposium, taking place in sunny Prague. I didn’t arrive until lunchtime on Monday and, with the event already underway, I hurried to the venue. First port of call was the European Association of Remote Sensing Companies (EARSC) stand, as we’ve got copies of flyers and leaflets there. Why not pop along and have a look!

The current excitement and interest in Earth observation (EO) was obvious when I made my way towards the final sessions of the day. The Sentinel-2 and Landsat-8 synergy presentations were packed out, all seats taken and people were crowding the door to watch!

I started with the Thematic Exploitation Platforms session. For a long time the remote sensing community has wanted more data, and now we’re receiving it in ever larger quantities, e.g. the current Copernicus missions are generating terabytes of data daily. With the storage requirements this creates, there is a lot of interest in online platforms that hold the data, where you upload your code or use the tools provided, rather than everyone trying to download their own individual copies. It was interesting to compare and contrast the approaches taken with hydrology, polar, coastal, forestry and urban EO data.

Tuesday was always going to be my busiest day of the Symposium, as I was chairing two sessions and giving a presentation. I had an early start, as I was co-chairing the 0800 session on Coastal Zones alongside Bob Brewin – a former PhD student of mine! It was great to see people presenting their results using Sentinel-2. The spatial resolution, 10 m for the highest resolution wavebands, allows us to see the detail of suspended sediment resuspension events, and the 705 nm waveband can be used for phytoplankton; but we’d still like an ocean colour sensor at this spatial resolution!

In the afternoon I headed into European Climate Data Records, where there was an interesting presentation on a long time-series AVHRR above-land aerosol dataset, in which the AVHRR data is being vicariously calibrated using the SeaWiFS ocean colour sensor. It is great to see innovation within the industry, where sensors launched for one set of applications can be reused in others. One thing that was emphasised by presenters in both this session and the earlier Coastal Zones one was the need to reprocess datasets to create improved data records.

My last session of the day was on Virtual Research, where I was both co-chairing and presenting. It returned to the theme of handling large datasets, and the presentations focused on building resources that make using EO data easier. This ranged from bringing in-situ and EO data together by standardising the formatting and metadata of the in-situ data, through community datasets for algorithm performance evaluation, to data cubes that bring all the data needed to answer specific questions together into a three- (or higher) dimensional array, meaning you don’t spend all your time trying to read different datasets rather than asking questions of them. My own presentation focused on our involvement with the ESA-funded E-Collaboration for Earth Observation (E-CEO) project, which developed a collaborative platform where challenges can be initiated and evaluated, allowing participants to upload their code and have it evaluated against a range of metrics. We ran an example challenge focused on the comparison of atmospheric correction processors for ocean colour data that, once set up, could easily be rerun.

I’ve already realised that there are too many interesting parallel sessions here, as I missed the ocean colour presentations, which I’ve heard were great. The good news for me is that these sessions were recorded. So if you haven’t been able to make it to Prague in person, or, like me, you are here but haven’t seen everything you wanted, a selection of sessions will be available to view on ESA’s site; for example, you can see the opening session here.

Not only do events like this give you a fantastic chance to learn about what’s happening across the EO community, they also give you the opportunity to catch up with old friends. I am looking forward to the rest of the week!