Monitoring Water Quality from Space

Algal Blooms in Lake Erie, around Monroe, acquired by Sentinel-2 on 3rd August 2017. Data Courtesy of ESA/Copernicus.

Two projects using Earth Observation (EO) data to monitor water quality caught our eye recently. As we’re in the process of developing two water quality products for our own online portal, we’re interested in what everyone else is doing!

At the end of January UNESCO’s International Hydrological Programme launched a tool to monitor global water quality. The International Initiative on Water Quality (IIWQ) World Water Quality Portal, built by EOMAP, provides:

  • turbidity and sedimentation distribution
  • chlorophyll-a concentration
  • Harmful Algal Blooms indicator
  • organic absorption
  • surface temperature

Based on optical data from Landsat and Sentinel-2, it can provide global surface water mosaics at 90 m spatial resolution, alongside 30 m resolution for seven pilot river basins. The portal was launched in Paris at the “Water Quality Monitoring using Earth Observation and Satellite-based Information” meeting and was accompanied by an exhibition on “Water Quality from Space – Mesmerizing Images of Earth Observation”.

The tool, which can be found here, focuses on providing colour visualizations of the data alongside data legends to help make it as easy as possible to use. It is hoped that this will help inform and educate policy makers, water professionals and the wider public about the value of using satellite data for monitoring water resources.

A second interesting project, albeit on a smaller scale, was announced last week and is going to use Sentinel-2 imagery to monitor water quality in Scottish lochs. Dr Claire Neil, from the University of Stirling, is leading the project and will be working with the Scottish Environment Protection Agency (SEPA). It will use reflectance measurements to estimate chlorophyll-a concentrations, helping to identify algal blooms and other contaminants in the waters. The project will offer an alternative to the current water quality monitoring approach, which relies on sampling close to the water’s edge.

An interesting feature of the project, particularly for us, is the intention to focus on developing this work into an operational capability for SEPA to enable them to improve their approach to assessing water quality.

This transition from a ‘good idea’ into an operational product that will be used, and therefore purchased, by end users is what all EO companies are looking for, and we’re no different. Our Pixalytics Portal, which we discussed a couple of weeks ago, is one of the ways we are trying to move in that direction. We have two water quality monitoring products on it:

  • Open Ocean Water Quality product extracts time-series data from a variety of 4 km resolution satellite datasets from NASA, giving an overview of what is happening in the water without the need to download a lot of data.
  • Planning for Coastal Airborne Lidar Surveys product provides an assessment of the penetration depth of a Lidar laser beam, from an airborne survey system, within coastal waters based on the turbidity of the water. This ensures that companies who plan overflights can have confidence in how far their Lidar will see.
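
To give a flavour of what the Open Ocean product does under the hood, the core operation is extracting one location’s values from each date in a stack of gridded data, rather than shipping whole files to the user. The sketch below is illustrative only – the grid layout and function names are ours, not the portal’s actual code:

```python
import numpy as np

def pixel_time_series(stack, lat, lon, lat0, lon0, cell):
    """Extract a single pixel's time series from a stack of gridded fields.

    stack      : array of shape (time, rows, cols), e.g. monthly chlorophyll grids
    lat0, lon0 : latitude of the top edge and longitude of the left edge
    cell       : grid cell size in degrees (a 4 km grid is roughly 1/24 degree)
    """
    row = int((lat0 - lat) / cell)   # rows count down from the top edge
    col = int((lon - lon0) / cell)   # columns count east from the left edge
    return stack[:, row, col]

# Tiny illustrative stack: 3 time steps on a 4 x 4 grid
stack = np.arange(48, dtype=float).reshape(3, 4, 4)
series = pixel_time_series(stack, lat=0.5, lon=1.5, lat0=2.0, lon0=0.0, cell=1.0)
```

A real product would also handle cloud and land masks, but the principle – returning a handful of numbers instead of gigabytes of imagery – is the same.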

We’re just at the starting point in productizing the services we offer, so it is always good to see how others are approaching similar problems!

To TEDx Speaking and Beyond!

Back in April I received an invitation to speak at the ‘One Step Beyond’ TEDx event organised at the National Space Centre in Leicester, with my focus on the Blue Economy and Earth Observation (EO).

We’ve been to a few TEDx events in the past and they’ve always been great, and so I was excited to have the opportunity to join this community. Normally, I’m pretty relaxed about public speaking. I spend a lot of time thinking about what I’m going to say, but don’t assemble my slides until a couple of days beforehand. This approach has developed in part because I used to lecture – where I got used to talking for a while with a few slides – but also because I always like to take some inspiration from the overall mood of the event I’m talking at. This can be through hearing other speakers, attending workshops or even just walking around the local area.

TEDx, however, was different. There was a need to have the talk ready early for previewing and feedback, alongside producing stunning visuals and having a key single message. So, for a change, I started with a storyboard.

My key idea was to get across the sense of wonder I and many other scientists share in observing the oceans from space, whilst also emphasising that anyone can get involved in protecting this natural resource. I echoed the event title by calling my talk “Beyond the blue ocean” as many people think of the ocean as just a blue waterbody. However, especially from space, we can see the beauty, and complexity, of colour variations influenced by the microscopic life and substances dissolved and suspended within it.

I began with an image called the ‘Pale Blue Dot’, taken by Voyager 1 at a distance of more than 4 billion miles from Earth, and then moved to the well-known ‘Blue Marble’ image before zooming into what we see from more conventional EO satellites. I also wanted to take the audience beyond just optical wavelengths, and so displayed microwave imagery from Sentinel-1 that’s at a similar spatial resolution to my processed 15 m resolution Sentinel-2 data, which was also shown.

Dr Samantha Lavender speaking at the One Step Beyond TEDx event in Leicester. Photo courtesy of TEDxLeicester

The satellite imagery included features such as wind farms, boats and phytoplankton blooms that I intended to discuss. However, this didn’t quite go to plan on my practice run-through! The talk was in the planetarium at the National Space Centre, which meant the screen was absolutely huge – as you can see in the image to the right. However, with the lights on in the room the detail in the images was really difficult to see. The solution for the talk itself was to have the planetarium in darkness with myself picked out by two large spotlights, meaning that the image details were visible to the audience but I couldn’t see the audience myself.

The evening itself took place on the 21st September, and with almost two hundred in the audience I was up first. I was very happy with how it went and the people who spoke to me afterwards said they were inspired by what they’d seen. You can see for yourself, as the talk can be found here on the TEDx library. Let me know what you think!

I was followed by two other fantastic speakers who gave inspiring presentations, and these are also up on the TEDx Library. Firstly, Dr Emily Shuckburgh, Deputy Head of the Polar Oceans team at the British Antarctic Survey, discussed “How to conduct a planetary health check”; and she was followed by Corentin Guillo, CEO and Founder of Bird.i, who spoke about “Space entrepreneurship, when thinking outside the box is not enough”.

The whole event was hugely enjoyable and the team at TEDx Leicester did an amazing job of organising it. It was good to talk to people after the event, and it was fantastic that seventy percent of the audience were aged between 16 and 18. We need to do much more of this type of outreach activity to educate and inspire the next generation of scientists. Of course, for me, the day also means that I can now add TEDx Speaker to my biography!

Can You See The Great Wall of China From Space?

Area north of Beijing, China, showing the Great Wall of China running through the centre. Image acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

Dating back over two thousand three hundred years, the Great Wall of China winds its way from east to west across the northern part of the country. The current remains were built during the Ming Dynasty and have a length of 8 851.8 km according to 2009 work by the Chinese State Administration of Cultural Heritage and the National Bureau of Surveying and Mapping. However, if you take into account the different parts of the wall built by other dynasties, its length is almost twenty-two thousand kilometres.

The average height of the wall is between six and seven metres, and its width is between four and five metres. This width would allow five horses, or ten men, to walk side by side. The sheer size of the structure has led people to believe that it could be seen from space. This was first described by William Stukeley in 1754, when he wrote in reference to Hadrian’s Wall that ‘This mighty wall of four score miles in length is only exceeded by the Chinese Wall, which makes a considerable figure upon the terrestrial globe, and may be discerned at the Moon.’

Despite Stukeley’s personal opinion having no scientific basis, it has been repeated many times since. By the time humans began to go into space, it was considered a fact. Unfortunately, astronauts such as Buzz Aldrin, Chris Hadfield and even China’s first astronaut, Yang Liwei, have all confirmed that the Great Wall is not visible from space with the naked eye. Even Pixalytics has got a little involved in this debate. Two years ago we wrote a blog saying that we couldn’t see the wall in Landsat imagery, as the spatial resolution was not fine enough to distinguish it from its surroundings.

Anyone who is familiar with the QI television series on the BBC will know that they occasionally ask the same question in different shows and give different answers when new information comes to light. This time it’s our turn!

Last week Sam was a speaker at the TEDx One Step Beyond event at the National Space Centre in Leicester – you’ll hear more of that in a week or two. However, in exploring some imagery for the event we looked for the Great Wall of China within Sentinel-2 imagery. And guess what? We found it! In the image at the top, the Great Wall can be seen cutting down the centre from the top left.

Screenshot of SNAP showing area north of Beijing, China. Data acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

It was difficult to spot. The first challenge was getting a cloud-free image of northern China, and we only found one covering our area of interest north of Beijing! Despite Sentinel-2 having 10 m spatial resolution for its visible wavelengths, the wall, as noted above, is generally narrower. This means it is difficult to see the actual wall itself, but it is possible to see its path in the image. This ability to see very small things from space by their influence on their surroundings is similar to how we are able to spot microscopic phytoplankton blooms. The image on the right is a screenshot from the Sentinel Application Platform (SNAP) tool, which shows the original Sentinel-2 image of China on the top left and a zoomed section identifying the wall.
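
For the curious, the way a sub-pixel feature betrays itself can be illustrated with a simple gradient filter: even when the wall is narrower than a pixel, the contrast it creates with its surroundings shows up as an edge. This is a toy example on a synthetic scene, not our actual processing:

```python
import numpy as np

def edge_magnitude(band):
    """Gradient magnitude of a single band; linear features such as walls or
    roads appear as ridges of high gradient even when narrower than a pixel."""
    gy, gx = np.gradient(band.astype(float))
    return np.hypot(gx, gy)

# Synthetic scene: uniform terrain with a one-pixel-wide bright line down it
scene = np.full((5, 5), 0.2)
scene[:, 2] = 0.6  # the "wall" and the ground it disturbs
edges = edge_magnitude(scene)
```

On either side of the bright line the gradient is strong, while the uniform terrain stays flat – which is roughly why the wall’s path is visible even when the wall itself is not resolved.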

So whilst the Great Wall of China might not be visible from space with the naked eye, it is visible from our artificial eyes in the skies, like Sentinel-2.

Supporting Soil Fertility From Space

Sentinel-2 pseudo-true colour composite from 2016 with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. Sentinel data courtesy of ESA/Copernicus.

Last Tuesday I was at the academic launch event for the Tru-Nject project at Cranfield University. Despite the event’s title, it was in fact an end of project meeting. Pixalytics has been involved in the project since July 2015, when we agreed to source and process high resolution satellite Earth Observation (EO) imagery for them.

The Tru-Nject project is funded via Innovate UK. Its official title is ‘Tru-Nject: Proximal soil sensing based variable rate application of subsurface fertiliser injection in vegetable/ combinable crops’. The focus is on modelling soil fertility within fields, to enable fertiliser to be applied in varying amounts using point-source injection technology, which reduces nitrogen loss to the atmosphere compared with spreading fertiliser on the soil surface.

To do this the project created soil fertility maps from a combination of EO products, physical sampling and proximal soil sensing – where approximately 15 000 georeferenced hyperspectral spectra are collected using an instrument connected to a tractor. These fertility maps are then interpreted by an agronomist, who decides on the relative application of fertiliser.

Initial results have shown that applying increased fertiliser to areas of low fertility improves overall yield when compared to applying an equal amount of fertiliser everywhere, or applying more fertiliser to high yield areas.

Pixalytics’ involvement in the work focussed on acquiring and processing historical, and new, sub-5 metre optical satellite imagery for two fields, near Hull and York. We have primarily acquired data from the Kompsat satellites operated by the Korea Aerospace Research Institute (KARI), supplemented with WorldView data from DigitalGlobe. Once we’d acquired the imagery, we processed it to:

  • remove the effects of the atmosphere, termed atmospheric correction, and then
  • convert them to maps of vegetation greenness
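
The greenness maps are essentially Normalized Difference Vegetation Index (NDVI) products, computed pixel-by-pixel from the atmospherically corrected red and near-infrared reflectances. A minimal sketch, with made-up reflectance values:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values range from -1 to 1; dense green vegetation is typically above 0.5.
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    # Avoid division by zero over very dark pixels
    denom = np.where((nir + red) == 0, np.nan, nir + red)
    return (nir - red) / denom

# Illustrative surface reflectances after atmospheric correction
red = np.array([[0.05, 0.20], [0.08, 0.30]])   # vegetation is dark in red
nir = np.array([[0.45, 0.25], [0.40, 0.32]])   # and bright in near-infrared
greenness = ndvi(red, nir)                     # vegetated pixels approach 0.8
```

NDVI works because healthy vegetation absorbs strongly in the red and reflects strongly in the near-infrared, so the normalized difference is largest over green crops.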

The new imagery needed to coincide with a particular stage of crop growth, which meant the satellite data acquisition period was narrow. This led to a pleasant surprise for Dave George, Tru-Nject Project Manager, who said, “I never believed I’d get to tell a satellite what to do.” To ensure that we collected data on specific days, we tasked the Kompsat satellites each year.

Whilst we were quite successful with the tasking, the combination of this being the UK and the fields being relatively small meant that some of the images were partly affected by cloud. Where this occurred we gap-filled with Copernicus Sentinel-2 data, which has a coarser spatial resolution (15 m) but more regular acquisitions.

In addition, we needed to undertake vicarious adjustment to ensure that we produced consistent products over time, even though the data came from different sensors with different specifications. As we cannot go to the satellite to measure its calibration, vicarious adjustment is a technique that uses ground measurements and algorithms to not only cross-calibrate the data, but also adjust for errors in the atmospheric correction.
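
At its simplest, the cross-calibration part of this amounts to finding the gain and offset that map one sensor’s reflectances onto another’s over shared targets. The least-squares sketch below uses made-up values and is only a simplified illustration of the idea, not our actual adjustment procedure:

```python
import numpy as np

# Illustrative coincident reflectance pairs over stable ground targets
sensor_a = np.array([0.10, 0.20, 0.30, 0.40, 0.50])  # reference sensor
sensor_b = 1.05 * sensor_a + 0.02                     # sensor with a bias

# Least-squares gain and offset that map sensor_b back onto sensor_a
gain, offset = np.polyfit(sensor_b, sensor_a, 1)
adjusted = gain * sensor_b + offset   # now consistent with sensor_a
```

In practice the relationship is derived from ground measurements rather than a reference satellite, and may be done band-by-band, but the principle of removing a systematic gain and offset is the same.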

An example of the work is at the top, which shows a Sentinel-2 pseudo-true colour composite from 2016 with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. The greener the NDVI product, the greener the vegetation, although the two datasets were collected in different years and so the planting within the field varies.

We’ve really enjoyed working with Stockbridge Technology Centre Ltd (STC), Manterra Ltd, and Cranfield University, who were the partners in the project. Up until last week all the work was done via telephone and email, and so it was great to finally meet them in-person, hear about the successful project and discuss ideas for the future.

Great Barrier Reef Coral Bleaching

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

Coral bleaching on the Great Barrier Reef in Australia was worse than expected last year, and a further decline is expected in 2017 according to the Great Barrier Reef Marine Park Authority. In a document issued this week they noted that, along with reefs across the world, the Great Barrier Reef has had widespread coral decline and habitat loss over the last two years.

We’ve written about coral bleaching before, as it’s a real barometer of climate change. To put the importance of the Great Barrier Reef into context:

  • It’s 2300 km long and covers an area of around 70 million football pitches;
  • Consists of 3000 coral reefs, which are made up from 650 different types of hard and soft coral; and
  • Is home to over 1500 types of fish and more than 100 varieties of sharks and rays.

Coral bleaching occurs when water stress causes coral to expel the photosynthetic algae, which give coral their colours, exposing the skeleton and turning them white. The stress is mostly due to higher seawater temperatures; although cold water stresses, run-off, pollution and high solar irradiance can also cause bleaching. Whilst bleaching does not kill coral immediately, it does put them at a greater risk of mortality from storms, poor water quality, disease and the crown-of-thorns starfish.

Last year the Great Barrier Reef suffered its worst bleaching on record; aerial and in-water surveys identified that 29% of shallow water coral reefs died in 2016, up from the original estimate of 22%. The most severe mortality was in an area to the north of Port Douglas, where 70% of the shallow water corals died. This is hugely sad news to Sam and me, as we explored this area of the Great Barrier Reef ourselves about fifteen years ago.

Whilst hugely concerning, there is also a little hope! There was a strong recovery of coral in the south of the Great Barrier Reef, where bleaching and other impacts were less severe.

Images from the Copernicus Sentinel-2A satellite captured on 8 June 2016 and 23 February 2017 show coral turning bright white for Adelaide Reef, Central Great Barrier Reef. Data courtesy of Copernicus/ESA, and contains modified Copernicus Sentinel data (2016–17), processed by J. Hedley; conceptual model by C. Roelfsema

The coral bleaching event this year has also been captured by Sentinel-2. Scientists from ESA’s Sen2Coral project have used change detection techniques to determine bleaching. Images between January and April showed areas of coral turning bright white and then darkening, although it was unclear whether the darkening was due to coral recovery or dead coral being overgrown with algae. In-water surveys were undertaken, which confirmed the majority of the darkened areas were algal overgrowth.
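
The change detection idea is straightforward at its core: difference the two dates and flag reef pixels that have brightened sharply, since bleached coral turns bright white. A simplified sketch – not Sen2Coral’s actual algorithm, and with made-up reflectances:

```python
import numpy as np

def bleaching_candidates(before, after, threshold=0.15):
    """Flag pixels that brightened markedly between two dates.

    Bleached coral turns bright white, so a large positive reflectance
    change over shallow reef pixels is a bleaching candidate.
    """
    change = after.astype(float) - before.astype(float)
    return change > threshold

before = np.array([[0.10, 0.12], [0.11, 0.10]])
after = np.array([[0.35, 0.13], [0.12, 0.40]])  # two pixels brighten sharply
mask = bleaching_candidates(before, after)
```

A real workflow would first correct for the water column and sun glint, and, as the ESA work showed, in-water surveys are still needed to distinguish recovery from algal overgrowth.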

This work has proved that coral bleaching can be seen from space, although it needs to be supported by in-situ work. ESA intends to develop a coral reef tool, which will be part of the open-source Sentinel Application Platform (SNAP) toolkit. This will enable anyone to monitor the health of coral reefs worldwide and hopefully, help protect these natural wonders.

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS Image from 25 April 2017, of the Yucatán Peninsula showing where thermal bands have picked-up increased temperatures. Data Courtesy of NASA, NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways of how satellites can monitor fires.

Acquired on 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument’s thermal band detected temperatures higher than normal.

False colour image of the West Mims fire on Florida/Georgia boundary acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border imaged by NASA’s Aqua satellite on 2 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). In the natural colour image the fires could only be seen as smoke plumes, but on the left is the false colour image, which combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself is shown as orange.
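
A false colour composite of this kind is simply the three chosen bands stacked into the red, green and blue display channels, usually with a contrast stretch. A simplified sketch, with band values made up for illustration:

```python
import numpy as np

def false_colour(swir, nir, green):
    """Stack shortwave-infrared, near-infrared and green bands into an RGB
    composite; burnt areas tend to appear brown/orange, vegetation green."""
    rgb = np.dstack([swir, nir, green]).astype(float)
    # Simple per-band min-max stretch to the 0-1 display range
    for i in range(3):
        band = rgb[:, :, i]
        lo, hi = band.min(), band.max()
        rgb[:, :, i] = (band - lo) / (hi - lo) if hi > lo else 0.0
    return rgb

# Two illustrative pixels: unburnt ground, then freshly burnt ground
swir = np.array([[0.10, 0.45]])    # burnt areas are bright in SWIR
nir = np.array([[0.20, 0.40]])
green = np.array([[0.30, 0.30]])
composite = false_colour(swir, nir, green)
```

The choice of bands is what makes the fires and burn scars pop out: burnt ground reflects strongly in the shortwave infrared while healthy vegetation dominates the near-infrared channel.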

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; it appears that the MODIS fire products contribute the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for undertaking this practice.

Imaged by Sentinel-2, burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest burnt last year in the Republic of Congo in Africa – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, although the revisit time will improve to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36 000 hectares of forest were burnt in 2016.
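
The area estimate follows directly from the pixel size: each 10 m pixel covers 100 square metres, and a hectare is 10 000 square metres, so 36 000 hectares corresponds to about 3.6 million burnt pixels. As a quick sketch:

```python
PIXEL_SIZE_M = 10        # Sentinel-2 visible/near-infrared resolution
M2_PER_HECTARE = 10_000

def burnt_hectares(n_pixels):
    """Convert a count of pixels classified as burnt into hectares."""
    return n_pixels * PIXEL_SIZE_M ** 2 / M2_PER_HECTARE

# 36 000 hectares, as reported for 2016, implies ~3.6 million burnt pixels
area = burnt_hectares(3_600_000)
```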

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses, potentially helping to save lives. This is another example of the societal benefit of satellite remote sensing.

Remote Sensing Goes Cold

Average thickness of Arctic sea ice in spring as measured by CryoSat between 2010 and 2015. Image courtesy of ESA/CPOM

Remote sensing over the Polar Regions has poked its head above the ice recently.

On the 8th February The Cryosphere, a journal of the European Geosciences Union, published a paper by Smith et al. titled ‘Connected subglacial lake drainage beneath Thwaites Glacier, West Antarctica’. It described how researchers used data from ESA’s CryoSat-2 satellite to look at lakes beneath a glacier.

This work is interesting from a remote sensing viewpoint as it is a repurposing of CryoSat-2’s mission. Its main purpose is to measure the thickness of ice sheets and marine ice cover using its Synthetic Aperture Radar (SAR)/Interferometric Radar Altimeter, known as SIRAL, and it can detect millimetre changes in the elevation of both ice sheets and sea ice.

The team were able to use this data to determine that the ice of the glacier had subsided by several metres as water drained away from four lakes underneath. Whilst the whole process took place between June 2012 and January 2014, the majority of the drainage happened in a six-month period. During this time it’s estimated that peak drainage was around 240 cubic metres per second, which is four times faster than the outflow of the River Thames into the North Sea.

We’ve previously highlighted that repurposing data – using data for more purposes than originally intended – is going to be one of the key future innovation trends for Earth Observation.

Last week, ESA also described how Sentinel-1 and Sentinel-2 data have been used over the last five months to monitor a crack in the ice near the Halley VI research base of the British Antarctic Survey (BAS). The crack, known as the Halloween Crack, is located on the Brunt Ice Shelf in the Weddell Sea sector of Antarctica and was identified last October. The crack grew around 600 m per day during November and December, although it has since slowed to only one third of that daily growth.

Since last November Sentinel-2 has been acquiring optical images at each overflight, and this has been combined with SAR data from the two Sentinel-1 satellites. This SAR data will be critical during the Antarctic winter when there are only a few hours of daylight and a couple of weeks around mid-June when the sun does not rise.

This work hit the headlines as BAS decided to evacuate their base for the winter, due to the potential threat. The Halley VI base, which was only 17 km from the crack, is the first Antarctic research station specifically designed to allow relocation to cope with this sort of movement in the ice shelf. It was already planned to move the base 23 km further inland, and this was successfully completed on the 2nd February. Further movement will depend on how the Halloween Crack develops over the winter.

Finally, the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) project was announced this week at the annual meeting of the American Association for the Advancement of Science. Professor Markus Rex outlined the project, which will sail a research vessel into the Arctic sea ice and let it get stuck so it can drift across the North Pole. The vessel will be fitted with a variety of remote sensing and in-situ instruments, and will aim to collect data on how the climate is changing in this part of the world by measuring the atmosphere-ice-ocean system.

These projects show that the Polar Regions have a lot of interest, and variety, for remote sensing.

Supporting Chimpanzee Conservation from Space

Gombe National Park, Tanzania. Acquired by Sentinel-2 in December 2016. Image courtesy of ESA.

Being able to visualise the changing face of the planet over time is one of the greatest strengths of satellite remote sensing. Our previous blog showed how Dubai’s coastline has evolved over a decade, and last week NASA described interesting work they’re doing on monitoring habitat loss for chimpanzees in conjunction with the Jane Goodall Institute.

Jane Goodall has spent over fifty years working to protect and conserve chimpanzees from the Gombe National Park in Tanzania, and formed the Jane Goodall Institute in 1977. The Institute works with local communities to provide sustainable conservation programmes.

A hundred years ago more than one million chimpanzees lived in Africa; today the World Wildlife Fund estimates the population may only be around 150,000 to 250,000. The decline is stark. For example, the Ivory Coast populations have declined by 90% within the last twenty years.

One of the key factors contributing to this decline is habitat loss, mostly through deforestation, although other factors such as hunting, disease and illegal capture also contribute.

Forests cover around 31% of the planet, and deforestation occurs when trees are removed and the land has another use instead of being a forest. In chimpanzee habitats, the deforestation is mostly due to logging, mining and drilling for oil. This change in land use can be monitored from space using remote sensing. Satellites produce regular images which can be used to monitor changes in the natural environment, in turn giving valuable information to conservation charities and other organisations.

In 2000 Lilian Pintea, from the Jane Goodall Institute, was shown Landsat images comparing the area around the Gombe National Park in 1972 and 1999. The latter image showed huge deforestation outside the park’s boundary. The Institute have continued to use Landsat imagery to monitor what is happening around the National Park. In 2009 they began a citizen science project with local communities, giving them smartphones to report their observations. Combining these with ongoing satellite data from NASA has helped develop and implement local plans for land use and protection of the forests. Further visualisation of this work can be found here. The image at the top was acquired by Sentinel-2 in December 2016 and shows the Gombe National Park, although it is under a little haze.

The satellite data supplied by NASA comes from the Landsat missions, which currently have an archive of almost forty-five years of satellite data that is freely available to anyone. We also used Landsat data in our Dubai animation last week. Landsat captures optical data, which means it operates in a similar manner to the human eye – although the instruments also have infrared capabilities. However, one drawback of optical instruments is that they cannot see through clouds. Therefore, whilst Landsat is great for monitoring land use when there are clear skies, it can be combined with synthetic aperture radar (SAR), from the microwave spectrum, which can see through both clouds and smoke. This combination enables land use and land change to be monitored anywhere in the world. Using the freely available Landsat and Sentinel-1 SAR data, you could monitor what is happening to the forests in your neighbourhood.
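
At its simplest, the combination works by falling back to the SAR-derived classification wherever the optical pixel is flagged as cloudy. A schematic sketch – the function and values are illustrative, not any specific product’s code:

```python
import numpy as np

def gap_fill(optical_class, sar_class, cloud_mask):
    """Use the SAR-derived land-cover class wherever the optical pixel
    was cloudy, since radar sees through cloud and smoke."""
    return np.where(cloud_mask, sar_class, optical_class)

optical = np.array([1, 2, 3])            # classes from a partly cloudy optical scene
sar = np.array([9, 9, 9])                # classes from a cloud-free SAR scene
cloudy = np.array([False, True, False])  # cloud mask for the optical scene
filled = gap_fill(optical, sar, cloudy)
```

Real optical/SAR fusion is more involved, as the two sensors measure different physical properties, but the cloud-mask fallback is the essential idea.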

Satellite data is a powerful tool for monitoring changes in the environment, and the archive of data available offers a unique opportunity to see what has happened over the last four decades.

Remote Sensing: Learning, Learned & Rewritten

Image of Yemen acquired by Sentinel-2 in August 2015. Data courtesy of ESA.

This blog post is about what I did, and the thoughts that came to mind, during my three-month ERASMUS+ internship at Pixalytics, which began in July and ends this week.

During my first week at Pixalytics, after being introduced to the Plymouth Science Park buildings and the office, my first task was to get a basic understanding of what remote sensing is actually about. With the help of Sam and Andy’s book, Practical Handbook of Remote Sensing, that was pretty straightforward.

As the words suggest, remote sensing is the acquisition of data and information about an object without being physically present at the site. It is then possible to perform a variety of analyses and processing on this data to better understand and study physical, chemical and biological phenomena that affect the environment.

Examples of programming languages: C, Python & IDL

I soon realized that quite a lot of programming was involved in the analysis of satellite data. From my point of view, though, some of the scripts, written in IDL (Interactive Data Language), were not as fast or efficient as they could be – sometimes far from it. With that in mind, I decided to rewrite one of the scripts, turning it into a C program. This allowed me to get a deeper understanding of satellite dataset formats (e.g. HDF, Hierarchical Data Format) and improve my overall knowledge of remote sensing.

While IDL, a long-standing scientific language for remote sensing, provides a quick way of writing code, it has a number of glaring downsides. Poor memory management and a complete lack of strictness often lead to scripts that break easily. It is also quite easy to write confusing spaghetti code, i.e. twisted and tangled code.

Writing C code, on the other hand, can get overly complicated and tedious for tasks that would require just a few lines in IDL. While it gives the programmer almost full control of what’s going on, sometimes it’s just not worth the time and effort.

Instead, I chose to rewrite the scripts in Python, which I found to be a good compromise. Indentation can sometimes be a bit annoying, and coming from other languages the syntax might seem unusual, but its great community and the wide availability of modules to achieve your goals in just a few lines really make up for it.
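
As an example of that compromise, a chore that needs careful bookkeeping in C – masking fill values and applying a scale factor, which is common when reading HDF satellite products – is a couple of readable lines in Python with numpy. The values below are made up:

```python
import numpy as np

# Raw digital numbers with a fill value, as often found in satellite datasets
raw = np.array([1200, 65535, 900, 1500, 65535])
FILL, SLOPE, INTERCEPT = 65535, 0.001, 0.0

# Mask the fill values and apply the scale in two readable lines
data = np.ma.masked_equal(raw, FILL)
reflectance = data * SLOPE + INTERCEPT
```

The masked array then propagates through any further arithmetic, so invalid pixels never contaminate statistics or plots – exactly the kind of safety net that takes real effort to build by hand in C.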

It was soon time to switch to a bigger and more complex task, which has been, to this day, what I would call my “main task” during my time at Pixalytics: building an automated online processing website. The website aspect was relatively easy with a combination of the usual HTML, Javascript, PHP and CSS; it was rewriting and integrating the remote sensing scripts that was difficult. Finally, all of those little, and sometimes not quite so little, scripts and programs were available from a convenient web interface, bringing much satisfaction and pride for all those hours of heavy thinking and brainstorming. Hopefully, you will read more about this development in the future from Pixalytics, as it will form the back-end of their product suite to be launched in the near future.

During my internship there was also time for events inside the Science Park, such as the Hog Roast, and outside it, when I participated in the South-West England QGIS User Group meeting in Dartmoor National Park. While that meeting was not exactly about remote sensing, being more on the Geographic Information System (GIS) side, it made me realize how much I had learned in my short time at Pixalytics, as I was able to exchange opinions and points of view with other people keen on the subject.

A side project I worked on in my final weeks was searching the world for stunning, interesting (and possibly both) places on Earth to make postcards from – such as the one at the top of the blog. At times, programming and reading scientific research can get challenging and/or frustrating, and it’s so relaxing to just look at and enjoy the beauty of our planet.

It is something that anyone can do as it takes little knowledge about remote sensing. Free satellite imagery is available through a variety of sources; what I found to be quite easy to access and use was imagery from USGS/NASA Landsat-8 and ESA Sentinel-2. It is definitely something I would recommend.

Finally, I want to say “thank you” to Sam and Andy, without whom I would never have had the opportunity to get the most out of this experience, in a field in which I’ve always been interested but had never had the chance to actually get my hands on.

Blog written by Davide Mainas on an ERASMUS+ internship with Pixalytics via the Tellus Group.