Pixalytics Four Year Celebration!

Sutichak Yachaingham / 123 Stock Photo

The start of June marked the four-year anniversary of Pixalytics! We hadn’t realised the time of year had come around again until Sam started receiving messages via her LinkedIn profile. A lot of small business owners are like us: busy working with their heads down, they forget to look up and celebrate their successes and milestones.

So, although we had to be prompted to look up, we’re going to celebrate our milestone of Pixalytics thriving – or maybe surviving – for four years!

The last twelve months have been really successful for us, with the main highlights being:

  • Doubling our company turnover.
  • Appointing our first additional full-time employee.
  • Having our book, Practical Handbook of Remote Sensing, published and on sale.
  • Winning a Space for Smarter Government Programme contract.
  • Expanding our EO products and services into AgriTech & flood mapping.
  • Being short-listed for the Plymouth Herald Small Business of the Year Award.
  • Being short-listed for the European Association of Remote Sensing Companies (EARSC) European EO Services Company Award.
  • Hosting two ERASMUS placements and other work experience students.

We wrote a blog last June identifying what we were hoping to achieve in the coming twelve months. The key things were developing our customer base, products, and services together with employing someone else full time. Those aims were definitely achieved!

Well, that’s enough of the celebrating! Like any other small business, we’re much more interested in what’s in our future than in our past. We’ve still got plenty of challenges ahead:

  • Doubling our turnover was a big leap, and this year we’ve got to maintain that level and ideally grow more.
  • Despite having additional hands in the business, we still have more ideas than capacity. Some of the ideas we had last year have been taken forward by other companies, before we’ve had the chance to get around to them! We wish them success and will be watching with interest to see how they develop.
  • Marketing is hard work. None of us at Pixalytics is a marketing expert, and we’re well aware of the difficulty of competing with firms who have sales and marketing teams promoting themselves at conferences and events. Our current approach is a combination of social media and carefully picking the events we attend. Both Sam and I are promoting Pixalytics this week, and then it’s back to the office next week to welcome our summer Space Placements in Industry (SPIN) student.

Our key target for the end of this year is to release an innovative series of automated Earth observation products and services that we can sell to clients across the world – we started to describe this journey here. We know we’ll be competing with companies much bigger than us, and we know it’s not going to be easy. To revisit the Samuel Beckett quote we used last year:

Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.

It still holds true for how we run our company. We try things. We fail. We succeed. We learn. We try new things.

We’re looking forward to what the next twelve months, or four years, have in store.

Uncovering Secrets with Remote Sensing

Artist’s rendition of a satellite – mechanik/123RF Stock Photo

Recent significant discoveries in Cambodia and Jordan have highlighted the potential offered by remote sensing and satellite imagery to help uncover secrets on Earth – a field known as satellite archaeology.

Cambodia
Helicopter-mounted Lidar was used to reveal multiple cities beneath the forest floor near the ancient temples of Angkor Wat in Cambodia. Lidar, which stands for Light Detection and Ranging, is an active optical remote sensing technique that uses a laser scanner to map the Earth’s topography by emitting a laser pulse and then receiving the backscattered signal. In Cambodia, a topographic Lidar with a near-infrared laser was used by Australian archaeologist Dr Damian Evans to survey beneath the forest vegetation.
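As a minimal sketch of the ranging principle (just the time-of-flight geometry, not the survey’s actual processing chain), the distance to a reflecting surface follows directly from the round-trip time of the laser pulse:

```python
# Toy illustration of Lidar ranging: distance from pulse round-trip time.
# This shows the basic time-of-flight principle only, not the survey's
# actual processing chain.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_range(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface from the pulse's round-trip time.

    The pulse travels out to the target and back, hence the division by two.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A return received ~6.67 microseconds after emission implies a ~1 km range,
# roughly the altitude of an airborne survey.
print(f"{lidar_range(6.67e-6):.1f} m")  # -> 999.8 m
```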

The conurbations discovered, surrounding the stone temple Preah Khan Kompong Svay, are believed to be between 900 and 1,400 years old. Analysis of the survey has shown a large number of homes packed together like terraced houses, together with structures for managing water and geometric patterns formed from earth embankments – which could be gardens.

At 734 square miles, the 2015 survey is also thought to be the most extensive of its type ever undertaken. Dr Evans’ work is due to be published in the Journal of Archaeological Science.

Jordan
Archaeologists using high resolution satellite imagery, drone surveys and imagery within Google Earth have discovered a huge structure buried in the sand less than a kilometre south of the city of Petra. The two high resolution satellites used were WorldView-1 and WorldView-2, operated by DigitalGlobe. WorldView-1 was launched in September 2007 and has a half-metre panchromatic resolution; WorldView-2, launched two years later, offers similar panchromatic resolution and 1.85m multispectral resolution.

The outline of the structure measures 56m x 49m, and there is a smaller platform contained inside the larger one. Nearby pottery finds suggest the platform is 2,150 years old, and it is thought that it had a ceremonial purpose. The research undertaken by Sarah Parcak and Christopher Tuttle was published in the May 2016 edition of the Bulletin of the American Schools of Oriental Research.

Benefits of Remote Sensing & Satellites
Angkor Wat and Petra are both World Heritage sites, and the benefits of using remote sensing and satellite technology to undertake archaeological investigations are evident in the statement from Christopher Tuttle, who noted that they did not intend to excavate their Petra discovery as ‘The moment you uncover something, it starts to disintegrate.’

Satellite technology allows investigations to take place without disturbing a piece of soil or grain of sand, which is a huge benefit in terms of time, cost and preservation in archaeology. These two discoveries also demonstrate that the world still has secrets to reveal. As Sarah Parcak herself said in 2013, “We’ve only discovered a fraction of one percent of archaeological sites all over the world.”

Who knows what remote sensing and satellite imagery will uncover in the future?

Monitoring ocean acidification from space

Enhanced pseudo-true colour composite of the United Kingdom showing coccolithophore blooms in light blue. Image acquired by MODIS-Aqua on 24th May 2016. Data courtesy of NASA.

What is ocean acidification?
Since the industrial revolution the oceans have absorbed approximately 50% of the CO2 produced by human activities (The Royal Society, 2005). Scientists previously saw this oceanic absorption as advantageous; however, ocean observations in recent decades have shown it has caused a profound change in ocean chemistry, resulting in ocean acidification (OA): as CO2 dissolves into the oceans it forms carbonic acid, lowering the pH and moving the oceans into a more acidic state. According to the National Oceanic and Atmospheric Administration (NOAA), ocean acidity has already increased by about 30%, and some studies suggest that if no changes are made it will have increased by 150% by 2100.

Impacts of OA
It’s anticipated that OA will impact many marine species. For example, it’s expected to have a harmful effect on some calcifying species such as corals, oysters, crustaceans, and calcareous plankton, e.g. coccolithophores.

OA can significantly reduce the ability of reef-building corals to produce their skeletons, and can cause the dissolution of oysters’ and crustaceans’ protective shells, making them more susceptible to predation and death. This in turn would affect the entire food web and the wider environment, and would have many socio-economic impacts.

Calcifying phytoplankton, such as coccolithophores, are thought to be especially vulnerable to OA. Coccolithophores are the most abundant type of calcifying phytoplankton in the ocean, are important for the global biogeochemical cycling of carbon, and form the base of many marine food webs. It’s projected that OA may disrupt the formation and/or cause the dissolution of coccolithophores’ calcium carbonate (CaCO3) shells, impacting future populations. Thus, changes in their abundance due to OA could have far-reaching effects.

Unlike other phytoplankton, coccolithophores are highly effective light scatterers relative to their surroundings due to their production of highly reflective calcium carbonate plates. This allows them to be easily seen in satellite imagery. The figure at the top of this page shows multiple coccolithophore blooms, in light blue, off the coast of the United Kingdom on 24th May 2016.

Current OA monitoring methods
Presently, the monitoring of OA and its effects is predominantly carried out by in situ observations from ships and moorings, using buoys and wave gliders for example. Although vital, in situ data is notoriously spatially sparse, as it is difficult to take measurements in certain areas of the world, especially hostile regions (e.g. the polar oceans). On its own, in situ data does not provide a comprehensive and cost-effective way to monitor OA globally. Consequently, this has driven the development of satellite-based sensors.

How can OA be monitored from space?
Although it is difficult to directly monitor changes in ocean pH using remote sensing, satellites can measure sea surface temperature and salinity (SST & SSS) and surface chlorophyll-a, from which ocean pH can be estimated using empirical relationships derived from in situ data. Although surface measurements may not be representative of deeper biological processes, surface observations are important for OA because the change in pH occurs at the surface first.
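As a sketch of how such an empirical relationship might be derived (the linear form and every number below are purely illustrative assumptions for demonstration, not a published ocean-acidification algorithm), matched in situ pH measurements can be regressed against satellite-derived SST and SSS:

```python
# Illustrative sketch: fitting an empirical pH model from matched in situ
# and satellite observations. The linear form and all values are invented
# for demonstration; they are not a published algorithm.
import numpy as np

# Hypothetical match-ups: satellite SST (deg C), SSS (PSU), in situ pH.
sst = np.array([10.2, 14.5, 18.1, 22.3, 26.0])
sss = np.array([33.1, 34.0, 34.8, 35.2, 35.9])
ph_in_situ = np.array([8.12, 8.09, 8.07, 8.05, 8.03])

# Least-squares fit of pH = a*SST + b*SSS + c.
design = np.column_stack([sst, sss, np.ones_like(sst)])
(a, b, c), *_ = np.linalg.lstsq(design, ph_in_situ, rcond=None)

# The fitted relationship can then be applied to satellite SST/SSS fields
# to map estimated surface pH over regions with no in situ coverage.
print(f"pH ~ {a:+.4f}*SST {b:+.4f}*SSS {c:+.2f}")
```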

In 2015, researchers at the University of Exeter, UK, became the first scientists to use remote sensing to develop a worldwide map of the ocean’s acidity, using satellite imagery from the European Space Agency’s Soil Moisture and Ocean Salinity (SMOS) satellite, launched in 2009, and NASA’s Aquarius satellite, launched in 2011 (SMOS remains in operation, although the Aquarius mission ended in 2015). Thermal sensors measure the SST while microwave sensors measure the SSS; there are also microwave SST sensors, but they have a coarse spatial resolution.

Future Opportunities – The Copernicus Programme
The European Union’s Copernicus Programme is in the process of launching a series of satellites, known as the Sentinels, which will improve understanding of large-scale global dynamics and climate change. Of the Sentinel satellite types, Sentinel-2 and Sentinel-3 are the most appropriate for assessment of the marine carbonate system. The Sentinel-3 satellite was launched in February this year and will focus mainly on ocean measurements, including SST, ocean colour and chlorophyll-a.

Overall, OA is a relatively new field of research, with most studies having been conducted over the last decade. It’s certain that remote sensing will have an exciting and important role to play in the future monitoring of this issue and its effects on the marine environment.

Blog written by Charlie Leaman, BSc, University of Bath, during a work placement at Pixalytics.

Playboy Magazine & Remote Sensing

Blue Marble image of the Earth taken by the crew of Apollo 17 on Dec. 7 1972. Image Credit: NASA

Are you aware of the role Playboy Magazine has played in the remote sensing and image processing industries? Anyone who has read a selection of image processing books or journals will probably recognise the Lena picture as a standard test image. If you don’t know the image, you can find it here. Lena’s history is interesting.

It began in 1973 when Alexander Sawchuk, then an assistant professor at the USC Signal and Image Processing Institute, was part of a small team searching for a human face to scan for a colleague’s conference paper. They wanted a glossy image to get a good output dynamic range, and during the search someone walked in with the November 1972 issue of Playboy. They used the centrefold image, of the Swedish model Lena Söderberg, as the glossy page could be wrapped around the drum of their scanner. As they only needed a 512 x 512 image, they scanned just the top 5.12 inches of the picture – 512 pixels at the scanner’s resolution of 100 lines per inch – creating a head shot rather than the original full nude centrefold.

From this beginning Lena, often called Lenna as this was the spelling used in Playboy, has gone on to become one of the most commonly used standard test images. There are a number of theories as to why this is the case, including:

  • The image has a good variety of different textural elements, such as light and dark, fuzzy and sharp, detailed and flat.
  • The grayscale version contains all the middle grays.
  • She has a symmetrical face, making any errors easy to see.
  • The image processing community is predominantly male!

Most often the image is used for compression testing, but it has also been used in the analysis of a wide variety of other techniques, such as the application of filtering for edge enhancement. Even as recently as three years ago, a group of scientists from Singapore shrunk the Lena image down to the width of a human hair as a demonstration of nanotechnology printing.
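To give a flavour of what compression testing involves (a generic sketch, not any particular study’s protocol; the file path is a placeholder for any test image), an image can be JPEG-compressed at a chosen quality and the distortion quantified with the peak signal-to-noise ratio (PSNR):

```python
# Generic sketch of image-compression testing: JPEG-compress a test image
# and measure the distortion with PSNR. Not any specific study's protocol.
import io

import numpy as np
from PIL import Image

def psnr(original: np.ndarray, degraded: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((original.astype(float) - degraded.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

image = Image.open("test_image.png").convert("RGB")  # placeholder path

buffer = io.BytesIO()
image.save(buffer, format="JPEG", quality=50)  # compress at quality 50
buffer.seek(0)
compressed = Image.open(buffer)

print(f"PSNR: {psnr(np.asarray(image), np.asarray(compressed)):.2f} dB")
```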

The wide use of Lena eventually came to the notice of Playboy, after the magazine Optical Engineering put her on its front cover in 1991. The Playboy organisation then tried to assert its copyright; however, the genie was out of the bottle given the sheer number of people using the image. The following year Optical Engineering reached an agreement with Playboy to continue using the image for scientific research and education. These copyright issues are why we didn’t include the Lena image in this blog – although it has been reported that Playboy now overlooks the use of Lena for image processing, we decided not to risk it! Playboy did help in the search for Lena in 1997, which enabled her to make a public appearance at the 50th Annual Conference of the Society for Imaging Science and Technology. An article written by Jamie Hutchinson giving a more detailed version of the Lena story can be found here.

What’s interesting about Lena is that despite all the technological advancements of the last forty years, she is still used as a standard test image. Contrast this with the famous Blue Marble image of the Earth, taken around the same time by astronauts aboard Apollo 17. The 1972 Blue Marble is probably the most iconic picture of the Earth and, unlike Lena, has inspired numerous later images. For example, NASA used the Terra satellite to produce a detailed true-colour image of the Earth in 2002, and then three years later surpassed it with a new image that had twice as much detail as the original. The latest NASA Blue Marble was issued last year, captured by the US DSCOVR Earth observation satellite.

Standard test images are important, but the image processing community should probably start to think about updating the ones we use. Anyone got any ideas?

Simplification in the Geospatial Industry

GEO Business 2016 at Business Design Centre, London.

It’s May, which means it’s GEO Business time at the Business Design Centre in London. Last year Pixalytics used this event to dip our collective toe into exhibiting; this year we’ve decided to see the other side and are attending as participants. Louisa and I are here to catch up with what’s happening in the geospatial industry through the conference presentations, the workshop programme and visiting the exhibition stands.

I attended the first conference session, which began with a keynote from Tom Cheesewright, Applied Futurist, highlighting the importance of location in bringing together the physical and digital worlds. This led into a presentation from Ed Parsons, Geospatial Technologist at Google, which discussed the changing face of the industry. In particular, he discussed the importance of simplifying our interfaces so users don’t have to know the details of how things work, and are only provided with the relevant information they want.

Gary Gale from What3Words applies this simplification approach to positioning. In his presentation he argued that address-based systems aren’t unique and coordinate systems aren’t easy for people to understand. Therefore, What3Words have proposed a naming system whereby every 3 metre square on Earth is referenced by just three words. For example, the Business Design Centre has a position of begins.pulse.status under this system.
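A toy version of the underlying idea (emphatically not What3Words’ actual algorithm or word list – the vocabulary below is invented for illustration) is to number the grid cells and encode each cell index as three ‘digits’ drawn from a word list:

```python
# Toy illustration of addressing grid cells with word triples. This is NOT
# What3Words' real algorithm or vocabulary; it only shows the principle
# that a large cell index can be encoded as three words from a fixed list.

WORDS = ["apple", "brick", "cloud", "delta"]  # a real system uses thousands

def cell_to_words(cell_index: int) -> str:
    """Encode a cell index as three words (three base-len(WORDS) digits)."""
    base = len(WORDS)
    assert 0 <= cell_index < base ** 3, "index out of range for three words"
    first, rest = divmod(cell_index, base * base)
    second, third = divmod(rest, base)
    return ".".join([WORDS[first], WORDS[second], WORDS[third]])

# With a 40,000-word list, three words cover 40,000**3 = 6.4e13 cells,
# more than the ~5.7e13 three-metre squares covering the Earth's surface.
print(cell_to_words(27))  # -> "brick.cloud.delta"
```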

The third presentation in this session was given by Prof. Gianvito Lanzolla of Cass Business School, and discussed what business models may look like in the future. He explained that digitization leads to connectivity, and reminded everyone that phones and cameras only converged in 2002. This change is now moving into data, where connected products are becoming increasingly important, with trust and speed being key attributes.

The panel debate discussed the importance of disruptors for driving innovation forward, and how markets mature over time so that only the best offerings remain. There were also thoughts on privacy: people are happy to provide locational information when they want a service to know where they are, but future services need to focus on the location of the individual rather than their provided address.

This theme of simplification and ensuring that products are fit for purpose was picked up in the post-lunch session, where John Taylor, from the Land Registry, described how the MapSearch product for deeds was developed. Instead of trying to develop a complex interface with all possible features, they started with a stripped-down Minimum Viable Product. John highlighted the importance of discussing the solution with the users at every iteration, making sure the features included were wanted and would be used. This approach resulted in a 65% reduction in manual searches, which has reduced staff costs and saved money for customers, as manual searches for deeds were charged for whilst MapSearch is available for free.

Walking around the exhibition provided a good opportunity to catch up with colleagues and see what was trending. Alongside the instrumentation, there was what felt like an increased percentage of stands linked to UAVs (or drones) and to data analysis / web mapping companies.

As usual with conferences my head is buzzing with ideas and things to take back to Pixalytics. In a recent blog we discussed the start of our journey to develop our own products and services, and the themes of simplification and fit for purpose are certainly going to feed into our thinking!

Flood Forecasting & Mapping

Sentinel-1 data for York overlaid in red with Pixalytics flood mapping layer based on Giustarini approach for the December 2015 flooding event. Data courtesy of ESA.

Media headlines this week have shouted that the UK is in for a sizzling summer with temperatures in the nineties, coupled with potential flooding in August due to the La Niña weather process.

The headlines were based on the UK Met Office’s three-month outlook for contingency planners. Unfortunately, when we looked at the information ourselves, it didn’t exactly say what the media headlines claimed! The hot temperatures were just one of a number of potential scenarios for the summer. As any meteorologist will tell you, forecasting a few days ahead is difficult; forecasting three months ahead is highly complex!

Certainly, La Niña is likely to have an influence. As we’ve previously written, this year has been influenced by a significant El Niño, where there are warmer ocean temperatures in the equatorial Pacific. La Niña is the opposite phase, with colder ocean temperatures in that region. For the UK this means there is a greater chance of summer storms, which would mean more rain and potential flooding. However, there are a lot of ifs!

At the moment our ears prick up at any mention of flooding, as Pixalytics has just completed a proof-of-concept project, in association with the Environment Agency, looking to improve operational flood water extent mapping during flooding incidents.

The core of the project was to implement recent scientific research published by Matgen et al. (2011), Giustarini et al. (2013) and Greifeneder et al. (2014). So it was quite exciting to find out that Laura Giustarini was giving a presentation on flooding during the final day of last week’s ESA Living Planet Symposium in Prague – I wrote about the start of the Symposium in our previous blog.

Laura’s presentation, An Automatic SAR-Based Flood Mapping Algorithm Combining Hierarchical Tiling and Change Detection, was particularly interesting because when we started to implement the research on Sentinel-1 data, we also came to the conclusion that the data needed to be split into tiles. It was great to hear Laura present, and I managed to pick her brains a little at the end of the session. At the top of the blog is a Sentinel-1 image of York, overlaid with a Pixalytics-derived flood map in red for the December 2015 flooding, based on the research published by Laura.
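As a highly simplified sketch of the core idea (tiled processing of SAR backscatter – the fixed threshold below is our illustrative assumption, whereas the published approach estimates thresholds statistically from each tile’s histogram), open water appears dark in radar imagery, so low-backscatter pixels can be flagged as candidate flooding:

```python
# Highly simplified sketch of SAR flood mapping by tiled thresholding.
# Giustarini et al. fit a statistical model to the backscatter histogram
# of each tile; here a fixed dB threshold (an assumption) stands in for
# that step so the tiling structure is visible.
import numpy as np

def flood_mask(sigma0_db: np.ndarray, tile: int = 512,
               threshold_db: float = -18.0) -> np.ndarray:
    """Flag pixels darker than threshold_db, processing the scene in tiles.

    Open water is smooth and reflects the radar pulse away from the
    sensor, so flooded pixels appear dark (low backscatter).
    """
    mask = np.zeros(sigma0_db.shape, dtype=bool)
    rows, cols = sigma0_db.shape
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            block = sigma0_db[r:r + tile, c:c + tile]
            mask[r:r + tile, c:c + tile] = block < threshold_db
    return mask

# Synthetic backscatter standing in for a calibrated Sentinel-1 scene (dB).
scene = np.random.normal(-10.0, 3.0, size=(1024, 1024))
print(f"Flooded fraction: {flood_mask(scene).mean():.3f}")
```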

The whole session on flooding, which took place on the last morning of the Symposium, was interesting. The presentations also included:

  • The use of COSMO-SkyMed data for mapping floods in forested areas within Finland.
  • Extending flood mapping to consider Sentinel-1 InSAR coherence and polarimetric information.
  • An intercomparison of the processing systems developed at DLR.
  • The development of operational flood mapping in Norway.

It was useful to understand where others were making progress with Sentinel-1 data, and how different processing systems were operating. It was also interesting that several presenters showed findings, or made comments, related to the double bounce experienced when a radar signal is reflected off not just the ground but also another structure, such as a building or tree. Again, this is something we needed to consider, as we were particularly looking at urban areas.

The case study of our flood mapping project was published last week on the Space for Smarter Government Programme website, as they funded the project via the UK Space Agency, using the Small Business Research Initiative supported by Innovate UK.

We are continuing with our research, with the aim of having our own flood mapping product later this year – although the news that August may bring flooding means we might have to quicken our development pace!

Living Planet Is Really Buzzing!

Living Planet rotating globe in the exhibition area. Photo: S Lavender

This week I’m at the European Space Agency’s 2016 Living Planet Symposium, taking place in sunny Prague. I didn’t arrive until lunchtime on Monday, and with the event already underway I hurried to the venue. My first port of call was the European Association of Remote Sensing Companies (EARSC) stand, as we’ve got copies of our flyers and leaflets there. Why not pop along and have a look!

The current excitement and interest in Earth observation (EO) was obvious when I made my way towards the final sessions of the day. The Sentinel-2 and Landsat-8 synergy presentations were packed out: all seats were taken and people were crowding the door to watch!

I started with the Thematic Exploitation Platforms session. For a long time the remote sensing community has wanted more data, and now we’re receiving it in ever larger quantities – the current Copernicus missions, for example, are generating terabytes of data daily. With the storage requirements this creates, there is a lot of interest in online platforms that hold the data, where you upload your code or use the tools the platform provides, rather than everyone trying to download their own individual copies. It was interesting to compare and contrast the approaches taken with hydrology, polar, coastal, forestry and urban EO data.

Tuesday was always going to be my busiest day of the Symposium, as I was chairing two sessions and giving a presentation. I had an early start, as I was co-chairing the 0800 session on Coastal Zones alongside Bob Brewin – a former PhD student of mine! It was great to see people presenting their results using Sentinel-2. The spatial resolution – 10m for the highest-resolution wavebands – allows us to see the detail of suspended sediment resuspension events, and the 705 nm waveband can be used for phytoplankton; but we’d still like an ocean colour sensor at this spatial resolution!

In the afternoon I headed into European Climate Data Records, where there was an interesting presentation on a long time-series AVHRR above-land aerosol dataset, in which the AVHRR data is being vicariously calibrated using the SeaWiFS ocean colour sensor. It’s great to see innovation within the industry, where sensors launched for one set of applications can be reused in others. One thing emphasised by presenters in both this session and the earlier Coastal Zones one was the need to reprocess datasets to create improved data records.

My last session of the day was on Virtual Research, where I was both co-chairing and presenting. It returned to the theme of handling large datasets, and the presentations focused on building resources that make using EO data easier. This ranged from bringing in-situ and EO data together by standardising the formatting and metadata of the in-situ data, through community datasets for algorithm performance evaluation, to data cubes that bring all the data needed to answer specific questions together into a three- (or higher) dimensional array, so you don’t spend all your time reading different datasets rather than asking questions of them. My own presentation focused on our involvement with the ESA-funded E-Collaboration for Earth Observation (E-CEO) project, which developed a collaborative platform where challenges can be initiated and evaluated, allowing participants to upload their code and have it evaluated against a range of metrics. We ran an example challenge focused on the comparison of atmospheric correction processors for ocean colour data that, once set up, could easily be rerun.

I’ve already realised that there are too many interesting parallel sessions here, as I missed the ocean colour presentations, which I’ve heard were great. The good news for me is that these sessions were recorded. So if you haven’t been able to make it to Prague in person, or like me you are here but haven’t seen everything you wanted, there is going to be a selection of sessions to view on ESA’s site – for example, you can see the opening session here.

Not only do events like this give you a fantastic chance to learn about what’s happening across the EO community, they also give you the opportunity to catch up with old friends. I’m looking forward to the rest of the week!

Is the remote sensing market an urban legend?

Yeti footprints on ice – erectus/123RF Stock Photo

The remote sensing/Earth observation (EO) market is like the Yeti or the Loch Ness Monster – there are plenty of people out there who tell you it exists, but very few companies have seen it with their own eyes!

We work in a fast growing and expanding industry, at least according to the myriad of reports that regularly drop into our inboxes. For example, over the last few weeks we’ve had press releases such as:

With all this growth everything in the remote sensing/EO industry is fantastic, right? Well, no actually! Despite the report announcements, lots of companies within the industry are struggling to locate this valuable market.

Historically, a lot of funding was provided by governments and space agencies in the form of grants or tenders to promote the use, and uptake, of EO data, which enabled companies to develop and grow. Whilst such sources of funding are still available, the maturing of the industry coupled with the global economic slowdown is starting to constrict this revenue stream, forcing more and more EO companies out into the commercial world in search of the fabled billion-dollar market. This development is currently being supported by venture capital, as the growth forecasts are encouraging investment, but how many of these companies will be able to transition into profit-making businesses?

The Holy Grail for everyone is a reliable, consistent and expanding market for EO products and services, something that few businesses in our sector have successfully found. There are a variety of reasons why the market feels like an urban legend, including:

  • Lack of knowledge of the products customers want, leading to supplier-led, rather than consumer-led, product development.
  • Lack of an existing market meaning that EO companies need to work hard on advertising to tell possible customers they exist and the benefits they can offer.
  • Monopolistic behaviour of governments/space agencies. These bodies have spent large sums to launch satellites and need to demonstrate value for money. For example, the European Commission’s Copernicus Programme recently announced its intention to develop agriculture products from Sentinel data. Rather than developing the market, this could potentially destroy the market for existing EO companies.

It’s clear that to get proof of a remote sensing/EO market, companies need to develop value for money products that customers want, demonstrate the benefits of satellite data as an information source and stand out from the other legend hunters!

Here at Pixalytics we’re in the process of packing our data, securing our satellite links and checking our geo-referenced maps, ready to set out on our journey in search of the fabled market. To date, our business has focussed on bespoke specialised products for individual customers; now we’re also hoping to develop more standard products that can be processed on demand, or made available from a pre-processed archive.

Of course, we don’t have all the answers as to where to find the customers, what the right products are, or the best way of letting people know we exist and can help them. Although, having seen the cost of these industry reports, we’re starting to think that writing, and selling, remote sensing/EO market reports is where the real money is!

Over the next few months, we’ll use this blog to tell you about our journey, the mistakes we make and what we learn. As we get glimpses of the market we’ll put them up here, although they might be grainy and indistinguishable – but then, aren’t all urban legend pictures?!

Is This The Worst Global Coral Bleaching Event Ever?

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

It was announced last week that 93% of the Great Barrier Reef has been hit by coral bleaching due to rising sea temperatures from El Niño and climate change. We first wrote about the third worldwide coral bleaching event in October 2015, noting this year’s event could be bad. Those fears appear to be coming true with the results of Australia’s National Coral Bleaching Task Force aerial survey of 911 coral reefs, which found 93% had suffered from bleaching, of which 55% was severe.

Coral bleaching occurs when water stresses cause coral to expel their photosynthetic algae, which give coral its colours, exposing the skeleton and turning it white. The stress is mostly due to higher seawater temperatures, although cold water stresses, run-off, pollution and high solar irradiance can also cause bleaching.

Bleaching does not kill coral immediately, but puts it at a greater risk of mortality. Recovery is also possible if the water stress reduces and normal conditions return, which is what is hoped for in the Northern Sector of the reef above Port Douglas, where around 81% of corals suffered severe bleaching – the water quality in this area is good, which should also aid recovery. The reefs fared better further south. Within the Central Sector, between Port Douglas and Mackay, 75 of the 226 reefs suffered severe bleaching, whilst in the Southern Sector below Mackay only 2 reefs suffered severe bleaching and 25% had no bleaching.

The news is not all bad. A survey of the coral reefs of the Andaman and Nicobar Islands, a territory of India that marks the dividing line between the Bay of Bengal and the Andaman Sea, also published this week, shows no evidence of coral bleaching. This survey is interesting for the remote sensing community as it was undertaken by a remotely operated vehicle, PROVe, developed by India’s National Institute of Ocean Technology. As well as mapping the coral reefs, PROVe has a radiometer attached and is measuring the spectral signatures of the coral in the area, which could be used to support the monitoring of corals from satellites.

Monitoring coral bleaching from space has been done before. For example, Envisat’s MERIS sensor was shown to be able to detect coral bleaching down to a depth of ten metres, and the Coral Bleaching Index (Ziskin et al., 2011) uses the red, green and blue bands to measure increases in spectral reflectance from bleached corals. Given the size, geographical spread and oceanic nature of coral reefs, satellite remote sensing should be able to offer valuable support in monitoring their health.
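One plausible reading of that description (a sketch only – the exact Ziskin et al. (2011) formulation may differ, and the margin below is an invented parameter) is that bleached, white coral reflects more across the whole visible spectrum, so pixels whose red–green–blue brightness rises between two dates are candidate bleaching:

```python
# Illustrative sketch only: bleached (white) coral reflects more across
# the visible spectrum, so a simple brightness measure over the red,
# green and blue bands rises where bleaching occurs. This is one plausible
# reading of the description above, not the exact Ziskin et al. index.
import numpy as np

def visible_brightness(red, green, blue):
    """Per-pixel mean reflectance across the three visible bands."""
    return (np.asarray(red) + np.asarray(green) + np.asarray(blue)) / 3.0

def bleaching_candidates(brightness_before, brightness_after, margin=0.05):
    """Flag pixels whose visible brightness rose by more than the margin."""
    return (brightness_after - brightness_before) > margin

# Synthetic reflectance values standing in for before/after imagery:
# the first pixel brightens strongly (bleached), the second is unchanged.
before = visible_brightness([0.10, 0.12], [0.12, 0.13], [0.11, 0.12])
after = visible_brightness([0.25, 0.12], [0.27, 0.13], [0.26, 0.12])
print(bleaching_candidates(before, after))  # -> [ True False]
```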

Following the second global bleaching event, in 1997/98, research confirmed that 16 percent of the world’s coral died. Who knows what the outcome of the current event will be?

Sentinel’s Milestone and Millstone

Sentinel-1A multi-temporal colour composite of land coverage across Ireland. Contains modified Copernicus Sentinel data [2015], processed by ESA. Data courtesy of ESA.

A significant milestone was achieved for the European Commission’s Copernicus Programme with the launch of the Sentinel-1B satellite. It was the fourth Sentinel satellite launched and, as the second Sentinel-1 satellite, it completes the first of the planned constellations.

It was launched on 25th April from French Guiana. In addition to Sentinel-1B, three student CubeSats were onboard the Soyuz rocket. Students from the University of Liege, the Polytechnic of Turin in Italy, and the University of Aalborg developed the 10 cm cube satellites as part of ESA’s ‘Fly Your Satellite!’ programme, and they will be deployed into orbit.

Sentinel-1B is an identical twin of Sentinel-1A, which was launched on 3rd April 2014, and the two will operate as a constellation orbiting 180 degrees apart at an altitude of approximately 700 km. Both carry a C-band Synthetic Aperture Radar (SAR) instrument, and together they will cover the entire planet every six days, with the Arctic revisited every day and Europe, Canada and the main shipping routes every three days.

Sentinel-1 data has a variety of applications, including sea ice monitoring, maritime surveillance, humanitarian aid in disasters, and mapping for forest, water and soil management. The benefits were demonstrated this week with:

  • The release of a video showing the drop in rice-growing productivity in the Mekong River Delta over the last year; and
  • The multi-temporal colour composite of land coverage of Ireland shown at the top of this post. It was created from 16 radar scans over 12 days during May 2015, where the blues represent changes in water or agricultural activities such as ploughing, the yellows represent urban centres, vegetated fields and forests appear in green, and the reds and oranges represent unchanging features such as bare soil.

With this constellation up and working, the revisit speed has the chance to be a game changer in the uptake of space-generated data.

Sadly, there’s a millstone hanging around the Copernicus Programme’s neck hindering this change – accessing the data remains difficult for commercial organisations.

Currently, selecting and downloading Sentinel data is a painful process, one that mostly either does not work or is so slow you give up on it! This has been caused by the size of the datasets and the popularity of data that’s free to access for everyone worldwide.

There are a number of ways of getting access to this data, with varying success in our experience, including:

  • EU’s Copernicus Hub – Operational, but slow to use. Once you have selected the data to download, either manually or via a script (see the sketch after this list), the process is extremely slow and often times out before the download completes.
  • USGS – Offers Sentinel-2, but not Sentinel-1, data via its EarthExplorer and GloVis interfaces. The download process is easier, but the format of Sentinel-2 makes searching a bit strange in GloVis, and it’s only a partial representation of the available acquisitions.
  • The UK Collaborative Ground Segment – despite an agreement being signed with ESA in March 2015, access has not yet been made available to commercial entities.
  • It is possible to apply for access to the academically focused STFC Centre for Environmental Data Analysis (CEDA) system, which provides FTP access and has good download speeds for the data that’s available.
  • Amazon’s archive of Sentinel-2 data – this has good download speeds, but is cumbersome to search without developing software, i.e. scripts.
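For anyone scripting against the Copernicus Hub mentioned above, here is a minimal sketch of a product search using the hub’s OpenSearch API (the credentials and query are placeholders – a free registered account is required, and downloads from the links in the returned feed suffer the same speed problems described above):

```python
# Minimal sketch of querying the Copernicus Hub's OpenSearch API for
# Sentinel-1 products. The credentials and query are placeholders; the
# hub requires a (free) registered account.
import requests

HUB_URL = "https://scihub.copernicus.eu/dhus/search"

response = requests.get(
    HUB_URL,
    params={"q": "platformname:Sentinel-1", "rows": 10},
    auth=("username", "password"),  # placeholder credentials
    timeout=60,
)
response.raise_for_status()

# The hub returns an Atom XML feed of matching products; each entry holds
# a download link that can be fetched (slowly!) with the same credentials.
print(response.text[:500])
```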

There are also further services and routes being developed to facilitate searching and downloading from the various archives; for example, there’s a QGIS ‘Semi-Automatic Classification’ plugin and the EOProc SatCat service for Sentinel-2. With Sentinel-3A data coming online soon, the situation will get more complex for those of us trying to use data from all the Sentinel missions.

Getting the satellites into space is great, but that is only the first step in widening the use of space-generated data. Until the data is put into the hands of people who can use it to create value and inspire others, the Sentinel missions will not fulfil their potential.