Can You See The Great Wall of China From Space?

Area north of Beijing, China, showing the Great Wall of China running through the centre. Image acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

Dating back over two thousand three hundred years, the Great Wall of China winds its way from east to west across the northern part of the country. The current remains were built during the Ming Dynasty and have a length of 8 851.8 km, according to 2009 work by the Chinese State Administration of Cultural Heritage and the National Bureau of Surveying and Mapping. However, if you take into account the different parts of the wall built by other dynasties, its length is almost twenty-two thousand kilometres.

The average height of the wall is between six and seven metres, and its width is between four and five metres. This width would allow five horses, or ten men, to walk side by side. The sheer size of the structure has led people to believe that it could be seen from space. The idea was first recorded by William Stukeley in 1754, when he wrote in reference to Hadrian’s Wall that ‘This mighty wall of four score miles in length is only exceeded by the Chinese Wall, which makes a considerable figure upon the terrestrial globe, and may be discerned at the Moon.’

Although Stukeley’s opinion had no scientific basis, it has been repeated many times since, and by the time humans began to go into space it was considered a fact. Unfortunately, astronauts such as Buzz Aldrin, Chris Hadfield and even China’s first astronaut, Yang Liwei, have all confirmed that the Great Wall is not visible from space to the naked eye. Even Pixalytics has got a little involved in this debate: two years ago we wrote a blog saying that we couldn’t see the wall in Landsat imagery, as the spatial resolution was not fine enough to distinguish it from its surroundings.

Anyone who is familiar with the QI television series on the BBC will know that they occasionally ask the same question in different shows and give different answers when new information comes to light. This time it’s our turn!

Last week Sam was a speaker at the TEDx One Step Beyond event at the National Space Centre in Leicester – you’ll hear more about that in a week or two. However, in exploring some imagery for the event we looked for the Great Wall of China within Sentinel-2 imagery. And guess what? We found it! In the image at the top, the Great Wall can be seen cutting down the centre from the top left.

Screenshot of SNAP showing area north of Beijing, China. Data acquired by Sentinel-2 on 27th June 2017. Data courtesy of ESA/Copernicus.

It was difficult to spot. The first challenge was getting a cloud-free image of northern China, and we only found one covering our area of interest north of Beijing! Despite Sentinel-2 having 10 m spatial resolution for its visible wavelengths, as noted above the wall is generally narrower than that. This means it is difficult to see the actual wall itself, but it is possible to see its path in the image. This ability to see very small things from space through their influence on their surroundings is similar to how we are able to spot microscopic phytoplankton blooms. The image on the right is a screenshot from the Sentinel Application Platform (SNAP), showing the original Sentinel-2 image of China on the top left and a zoomed section identifying the wall.
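If you’d like to try this yourself, a simple contrast stretch is often enough to make faint linear features stand out in a true colour composite. Below is a minimal Python sketch (the file name and band order are hypothetical, and this is not the exact processing we used):

```python
import numpy as np
import rasterio

# Hypothetical GeoTIFF containing Sentinel-2 bands 4, 3, 2 (R, G, B) at 10 m.
with rasterio.open("S2_beijing_rgb.tif") as src:
    rgb = src.read([1, 2, 3]).astype("float32")

def stretch(band, lo=2, hi=98):
    """Linear contrast stretch between the lo-th and hi-th percentiles."""
    p_lo, p_hi = np.percentile(band, (lo, hi))
    return np.clip((band - p_lo) / (p_hi - p_lo), 0, 1)

# Stretching each band independently exaggerates subtle brightness
# differences, which helps faint linear features such as the wall's path.
enhanced = np.stack([stretch(b) for b in rgb])
```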

So whilst the Great Wall of China might not be visible from space with the naked eye, it is visible from our artificial eyes in the skies, like Sentinel-2.

Supporting Soil Fertility From Space

Sentinel-2 pseudo-true colour composite from 2016 with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. Sentinel data courtesy of ESA/Copernicus.

Last Tuesday I was at the academic launch event for the Tru-Nject project at Cranfield University. Despite the event’s title, it was in fact an end of project meeting. Pixalytics has been involved in the project since July 2015, when we agreed to source and process high resolution satellite Earth Observation (EO) imagery for them.

The Tru-Nject project is funded via Innovate UK. Its official title is ‘Tru-Nject: Proximal soil sensing based variable rate application of subsurface fertiliser injection in vegetable/combinable crops’. The focus is on modelling soil fertility within fields, to enable fertiliser to be applied in varying amounts using point-source injection technology, which reduces nitrogen loss to the atmosphere compared with spreading fertiliser on the soil surface.

To do this the project created soil fertility maps from a combination of EO products, physical sampling and proximal soil sensing – where approximately 15 000 georeferenced hyperspectral spectra are collected using an instrument connected to a tractor. These fertility maps are then interpreted by an agronomist, who decides on the relative application of fertiliser.

Initial results have shown that applying increased fertiliser to areas of low fertility improves overall yield when compared to applying an equal amount of fertiliser everywhere, or applying more fertiliser to high yield areas.

Pixalytics’ involvement in the work focussed on acquiring and processing historical and new sub-5 metre optical satellite imagery for two fields, near Hull and York. We have primarily acquired data from the Kompsat satellites operated by the Korea Aerospace Research Institute (KARI), supplemented with WorldView data from DigitalGlobe. Once we’d acquired the imagery, we processed it to:

  • remove the effects of the atmosphere, termed atmospheric correction, and then
  • convert it to maps of vegetation greenness (see the sketch below).
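As a rough illustration of that second step, the snippet below computes the Normalized Difference Vegetation Index (NDVI), the standard measure of vegetation greenness, from atmospherically corrected red and near-infrared bands. The file names are hypothetical, and the actual Tru-Nject processing chain was more involved:

```python
import numpy as np
import rasterio

# Hypothetical single-band GeoTIFFs of atmospherically corrected reflectance.
with rasterio.open("red.tif") as r, rasterio.open("nir.tif") as n:
    red = r.read(1).astype("float32")
    nir = n.read(1).astype("float32")

# NDVI = (NIR - Red) / (NIR + Red); it ranges from -1 to 1, with dense
# green vegetation typically above about 0.6.
ndvi = np.where(nir + red > 0, (nir - red) / (nir + red), np.nan)
```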

The new imagery needed to coincide with a particular stage of crop growth, which meant the satellite data acquisition window was narrow. To ensure that we collected data on specific days, we tasked the Kompsat satellites each year. This led to a pleasant surprise for Dave George, Tru-Nject Project Manager, who said, ‘I never believed I’d get to tell a satellite what to do.’

Whilst we were quite successful with the tasking, the combination of this being the UK and the fields being relatively small meant that some of the images were partly affected by cloud. Where this occurred we gap-filled with Copernicus Sentinel-2 data, which has a coarser spatial resolution (10 m) but more regular acquisitions.

In addition, we needed to undertake vicarious adjustment to ensure that we produced consistent products over time, as the data came from different sensors with different specifications. Since we cannot go up to a satellite to measure its calibration, vicarious adjustment uses ground measurements and algorithms not only to cross-calibrate the data, but also to adjust for errors in the atmospheric correction.
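The core of the cross-calibration can be sketched very simply: given reflectances from two sensors over the same stable (pseudo-invariant) targets, fit a gain and offset that maps one sensor onto the other. The real vicarious adjustment also folds in ground measurements and atmospheric-correction residuals, so treat this as the bare bones only:

```python
import numpy as np

# Hypothetical paired reflectances over pseudo-invariant targets: the same
# locations observed by a reference sensor and by the sensor to be adjusted.
ref = np.array([0.05, 0.12, 0.21, 0.33, 0.41])    # reference sensor
other = np.array([0.06, 0.14, 0.24, 0.36, 0.45])  # sensor to adjust

# Least-squares fit of gain and offset: ref ~ gain * other + offset.
gain, offset = np.polyfit(other, ref, 1)

# Apply the adjustment so the second sensor's data sit on the reference
# sensor's radiometric scale.
adjusted = gain * other + offset
```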

An example of the work is shown at the top: a Sentinel-2 pseudo-true colour composite from 2016, with a Kompsat-3 Normalized Difference Vegetation Index (NDVI) product from 2015 inset. The greener the NDVI product, the more vigorous the vegetation; note, though, that the two datasets were collected in different years, so the planting within the field varies.

We’ve really enjoyed working with Stockbridge Technology Centre Ltd (STC), Manterra Ltd and Cranfield University, who were the partners in the project. Up until last week all the work was done via telephone and email, so it was great to finally meet them in person, hear about the successful project and discuss ideas for the future.

Great Barrier Reef Coral Bleaching

Great Barrier Reef off the east coast of Australia where currents swirl in the water around corals. Image acquired by Landsat-8 on 23 August 2013. Image Courtesy of USGS/ESA.

Coral bleaching on the Great Barrier Reef in Australia was worse than expected last year, and a further decline is expected in 2017 according to the Great Barrier Reef Marine Park Authority. In a document issued this week they noted that, along with reefs across the world, the Great Barrier Reef has had widespread coral decline and habitat loss over the last two years.

We’ve written about coral bleaching before, as it’s a real barometer of climate change. To put the importance of the Great Barrier Reef into context:

  • It’s 2300 km long and covers an area of around 70 million football pitches;
  • It consists of 3000 coral reefs, made up of 650 different types of hard and soft coral; and
  • It is home to over 1500 types of fish and more than 100 varieties of sharks and rays.

Coral bleaching occurs when water stress causes coral to expel the photosynthetic algae, which give coral their colours, exposing the skeleton and turning them white. The stress is mostly due to higher seawater temperatures; although cold water stresses, run-off, pollution and high solar irradiance can also cause bleaching. Whilst bleaching does not kill coral immediately, it does put them at a greater risk of mortality from storms, poor water quality, disease and the crown-of-thorns starfish.

Last year the Great Barrier Reef suffered its worst bleaching on record; aerial and in-water surveys identified that 29% of shallow water coral reefs died in 2016, up from the original estimate of 22%. The most severe mortality was in an area north of Port Douglas, where 70% of the shallow water corals died. This is hugely sad news to Sam and me, as we explored this area of the Great Barrier Reef ourselves about fifteen years ago.

Whilst hugely concerning, there is also a little hope! There was a strong recovery of coral in the south of the Great Barrier Reef, where bleaching and other impacts were less severe.

Images from the Copernicus Sentinel-2A satellite captured on 8 June 2016 and 23 February 2017 show coral turning bright white for Adelaide Reef, Central Great Barrier Reef. Data courtesy of Copernicus/ESA, and contains modified Copernicus Sentinel data (2016–17), processed by J. Hedley; conceptual model by C. Roelfsema

The coral bleaching event this year has also been captured by Sentinel-2. Scientists from ESA’s Sen2Coral project have used change detection techniques to determine bleaching. Images between January and April showed areas of coral turning bright white and then darkening, although it was unclear whether the darkening was due to coral recovery or dead coral being overgrown with algae. In-water surveys were undertaken, which confirmed the majority of the darkened areas were algal overgrowth.
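To give a flavour of the change detection involved (this is not Sen2Coral’s actual algorithm, which also has to handle the water column and sun glint), the sketch below differences two co-registered reflectance images and flags pixels that have brightened sharply, as newly bleached coral would:

```python
import numpy as np
import rasterio

# Hypothetical co-registered green-band reflectance images of the same reef.
with rasterio.open("reef_2016.tif") as a, rasterio.open("reef_2017.tif") as b:
    before = a.read(1).astype("float32")
    after = b.read(1).astype("float32")

# Bleached coral turns bright white, so a large positive reflectance change
# is a candidate bleaching pixel; the threshold is purely illustrative.
change = after - before
bleach_candidates = change > 0.1
```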

This work has proved that coral bleaching can be seen from space, although it needs to be supported by in-situ work. ESA intends to develop a coral reef tool, which will be part of the open-source Sentinel Application Platform (SNAP) toolkit. This will enable anyone to monitor the health of coral reefs worldwide and hopefully, help protect these natural wonders.

Monitoring Fires From Space

Monitoring fires from space has significant advantages when compared to on-ground activity. Not only are wider areas easier to monitor, but there are obvious safety benefits too. The different ways this can be done have been highlighted through a number of reports over the last few weeks.

VIIRS Image from 25 April 2017, of the Yucatán Peninsula showing where thermal bands have picked-up increased temperatures. Data Courtesy of NASA, NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Firstly, NASA have released images from different instruments, on different satellites, that illustrate two ways of how satellites can monitor fires.

Acquired on 25 April 2017, an image from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite showed widespread fire activity across the Yucatán Peninsula in Mexico and northern Central America. The image to the right is a natural colour image, and each of the red dots represents a point where the instrument’s thermal band detected temperatures higher than normal.
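The idea behind those red dots can be sketched in a few lines: compare each pixel’s brightness temperature with its local background and flag strong positive outliers. Operational fire products for VIIRS and MODIS use far more sophisticated contextual tests, so the window size and threshold below are purely illustrative:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def flag_hotspots(bt, window=21, delta=10.0):
    """Flag pixels much hotter than their local background.

    bt     -- 2-D array of brightness temperatures (kelvin)
    window -- side length of the local averaging window (pixels)
    delta  -- excess over the local mean (kelvin) counted as a hotspot
    """
    background = uniform_filter(bt.astype("float32"), size=window)
    return bt - background > delta
```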

False colour image of the West Mims fire on Florida/Georgia boundary acquired by MODIS on 02 May 2017. Data courtesy of NASA. NASA image by Jeff Schmaltz, LANCE/EOSDIS Rapid Response.

Compare this to a wildfire on the Florida-Georgia border, acquired by NASA’s Aqua satellite on 02 May 2017 using the Moderate Resolution Imaging Spectroradiometer (MODIS). In the natural colour image the fires could only be seen as smoke plumes, but the false colour image on the left combines infrared, near-infrared and green wavelengths. The burnt areas can be clearly seen in brown, whilst the fire itself shows as orange.

This week it was reported that the Punjab Remote Sensing Centre in India has been combining remote sensing, geographical information systems and Global Positioning System (GPS) data to identify the burning of crop stubble in fields; it appears that the MODIS fire products contribute the satellite data. During April, 788 illegal field fires were identified through this technique, and with the GPS data the authorities have been able to identify, and fine, 226 farmers for undertaking this practice.

Imaged by Sentinel-2, burnt areas, shown in shades of red and purple, in the Marantaceae forests in the north of the Republic of Congo.
Data courtesy of Copernicus/ESA. Contains modified Copernicus Sentinel data (2016), processed by ESA.

Finally, a report at the end of April from the European Space Agency described how images from Sentinel-1 and Sentinel-2 have been combined to assess the amount of forest that was burnt last year in the Republic of Congo in Africa – the majority of which was in Marantaceae forests. As this area has frequent cloud cover, the optical images from Sentinel-2 were combined with the Synthetic Aperture Radar (SAR) images from Sentinel-1, which are unaffected by the weather, to offer an enhanced solution.

Sentinel-1 and Sentinel-2 data detect and monitor forest fires at a finer temporal and spatial resolution than previously possible, namely 10 days and 10 m, and the revisit time will shorten to 5 days later this year when Sentinel-2B becomes fully operational. Through this work, it was estimated that 36 000 hectares of forest were burnt in 2016.
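On the optical side, one common way to map burnt areas (we cannot confirm it is exactly what was done in this study) is the Normalized Burn Ratio (NBR), which contrasts near-infrared and shortwave-infrared reflectance; a large drop in NBR between two dates indicates burning:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: high for healthy vegetation, low when burnt."""
    return np.where(nir + swir > 0, (nir - swir) / (nir + swir), np.nan)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    """Difference of pre- and post-fire NBR; larger values indicate more
    severe burning (thresholds vary by ecosystem)."""
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
```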

Given the danger presented by forest fires and wildfires, greater monitoring from space should improve fire identification and emergency responses which should potentially help save lives. This is another example of the societal benefit of satellite remote sensing.

Remote Sensing Goes Cold

Average thickness of Arctic sea ice in spring as measured by CryoSat between 2010 and 2015. Image courtesy of ESA/CPOM

Remote sensing over the Polar Regions has poked its head above the ice recently.

On the 8th February The Cryosphere, a journal of the European Geosciences Union, published a paper by Smith et al. titled ‘Connected subglacial lake drainage beneath Thwaites Glacier, West Antarctica’. It described how researchers used data from ESA’s CryoSat-2 satellite to look at lakes beneath a glacier.

This work is interesting from a remote sensing viewpoint as it repurposes CryoSat-2’s mission. Its main purpose is to measure the thickness of ice sheets and marine ice cover using its Synthetic Aperture Radar (SAR)/Interferometric Radar Altimeter, known as SIRAL, and it can detect millimetre changes in the elevation of both ice sheets and sea ice.

The team were able to use this data to determine that the ice of the glacier had subsided by several metres as water drained away from four lakes underneath. Whilst the whole process took place between June 2012 and January 2014, the majority of the drainage happened in a six-month period. During this time it’s estimated that peak drainage was around 240 cubic metres per second, which is four times faster than the outflow of the River Thames into the North Sea.

We’ve previously highlighted that repurposing data – using data for more purposes than originally intended – is going to be one of the key future innovation trends for Earth Observation.

Last week, ESA also described how Sentinel-1 and Sentinel-2 data have been used over the last five months to monitor a crack in the ice near the Halley VI research base of the British Antarctic Survey (BAS). The crack, known as the Halloween Crack, is located on the Brunt Ice Shelf in the Weddell Sea sector of Antarctica and was identified last October. It grew around 600 m per day during November and December, although it has since slowed to around a third of that daily growth.

Since last November Sentinel-2 has been acquiring optical images at each overflight, and this has been combined with SAR data from the two Sentinel-1 satellites. This SAR data will be critical during the Antarctic winter when there are only a few hours of daylight and a couple of weeks around mid-June when the sun does not rise.

This work hit the headlines as BAS decided to evacuate their base for the winter, due to the potential threat. The Halley VI base, which was only 17 km from the crack, is the first Antarctic research station specifically designed to be relocated to cope with this sort of movement in the ice shelf. A move of the base 23 km further inland had already been planned, and this was successfully completed on the 2nd February. Further movement will depend on how the Halloween Crack develops over the winter.

Finally, the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) project was announced this week at the annual meeting of the American Association for the Advancement of Science. Professor Markus Rex outlined the project, which will sail a research vessel into the Arctic sea ice and let it get stuck so it can drift across the North Pole. The vessel will be fitted with a variety of remote sensing and in-situ instruments, and will aim to collect data on how the climate is changing in this part of the world by measuring the atmosphere-ice-ocean system.

These projects show that the Polar Regions have a lot of interest, and variety, for remote sensing.

Supporting Chimpanzee Conservation from Space

Gombe National Park, Tanzania. Acquired by Sentinel-2 in December 2016. Image courtesy of ESA.

Being able to visualise the changing face of the planet over time is one of the greatest strengths of satellite remote sensing. Our previous blog showed how Dubai’s coastline has evolved over a decade, and last week NASA described interesting work they’re doing on monitoring habitat loss for chimpanzees in conjunction with the Jane Goodall Institute.

Jane Goodall has spent over fifty years working to protect and conserve chimpanzees from the Gombe National Park in Tanzania, and formed the Jane Goodall Institute in 1977. The Institute works with local communities to provide sustainable conservation programmes.

A hundred years ago more than one million chimpanzees lived in Africa; today the World Wildlife Fund estimates the population may only be around 150,000 to 250,000. The decline is stark: the Ivory Coast populations, for example, have declined by 90% within the last twenty years.

One of the key factors contributing to this decline is habitat loss, mostly through deforestation, although other factors such as hunting, disease and illegal capture also contribute.

Forests cover around 31% of the planet’s land surface, and deforestation occurs when trees are removed and the land is converted to another use. In chimpanzee habitats, the deforestation is mostly due to logging, mining and drilling for oil. This change in land use can be monitored from space using remote sensing: satellites produce regular images which can be used to track changes in the natural environment, in turn giving valuable information to conservation charities and other organisations.

In 2000 Lilian Pintea, from the Jane Goodall Institute, was shown Landsat images comparing the area around the Gombe National Park in 1972 and 1999. The latter image showed huge deforestation outside the park’s boundary. The Institute have continued to use Landsat imagery to monitor what is happening around the National Park, and in 2009 they began a citizen science project with local communities, giving them smartphones to report their observations. Combining these with ongoing satellite data from NASA has helped develop and implement local plans for land use and protection of the forests. Further visualisation of this work can be found here. The image at the top was acquired by Sentinel-2 in December 2016 and shows the Gombe National Park, although it is under a little haze.

The satellite data supplied by NASA comes from the Landsat missions, which currently have an archive of almost forty-five years of data that is freely available to anyone; we also used Landsat data in our Dubai animation last week. Landsat captures optical data, which means it operates in a similar manner to the human eye, although the instruments also have infrared capabilities. One drawback of optical instruments is that they cannot see through clouds. Therefore, whilst Landsat is great for monitoring land use when there are clear skies, it can be combined with synthetic aperture radar (SAR) data, from the microwave part of the spectrum, which can see through both clouds and smoke. This combination enables land use and land change to be monitored anywhere in the world. Using the freely available Landsat and Sentinel-1 SAR data, you could monitor what is happening to the forests in your own neighbourhood.

Satellite data is a powerful tool for monitoring changes in the environment, and the archive of data available offers a unique opportunity to see what has happened over the last four decades.

Remote Sensing: Learning, Learned & Rewritten

Image of Yemen acquired by Sentinel-2 in August 2015. Data courtesy of ESA.

This blog post is about what I did, and the thoughts that came to mind, during my three-month ERASMUS+ internship at Pixalytics, which began in July and ends this week.

During my first week at Pixalytics, after being introduced to the Plymouth Science Park buildings and the office, my first task was to get a basic understanding of what remote sensing is actually about. With the help of Sam and Andy’s book, Practical Handbook of Remote Sensing, that was pretty straightforward.

As the words suggest, remote sensing is the acquisition of data and information about an object without needing to be on site. It is then possible to perform a variety of analysis and processing on this data to better understand and study the physical, chemical and biological phenomena that affect the environment.

Examples of programming languages: C, Python & IDL

I soon realized that quite a lot of programming was involved in the analysis of satellite data. From my point of view, though, some of the scripts, written in IDL (Interactive Data Language), were not as fast and efficient as they could be. With that in mind, I decided to rewrite one of the scripts, turning it into a C program. This allowed me to get a deeper understanding of satellite dataset formats (e.g. HDF, Hierarchical Data Format) and improve my overall knowledge of remote sensing.

While IDL, a long-established scientific language for remote sensing, provides a quick way of writing code, it has a number of glaring downsides. Poor memory management and a complete lack of strictness often lead to scripts that break easily. It is also quite easy to write not-so-pretty, confusing spaghetti code, i.e. twisted and tangled code.

Writing C code, on the other hand, can get overly complicated and tedious for tasks that would require just a few lines in IDL. While it gives the programmer almost full control of what’s going on, sometimes it’s just not worth the time and effort.

Instead, I chose to rewrite the scripts in Python, which I found to be a good compromise. Indentation can sometimes be a bit annoying, and coming from other languages the syntax might seem unusual, but its great community and the wide availability of modules that achieve your goals in just a few lines really make up for it.
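To give a feel for that compromise, here is the sort of thing I mean: opening an HDF5 file and pulling a dataset into a NumPy array takes only a few lines with the h5py module (the file and dataset names here are made up):

```python
import h5py

# Hypothetical HDF5 file holding a gridded geophysical product.
with h5py.File("chlorophyll.h5", "r") as f:
    print(list(f.keys()))   # inspect the top-level groups and datasets
    data = f["chlor_a"][:]  # read one dataset into a NumPy array
```

The equivalent C code, using the HDF5 library directly, runs to dozens of lines of handle management and error checking.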

It was soon time to switch to a bigger and more complex task, which has been, to this day, what I would call my “main task” during my time at Pixalytics: building an automated online processing website. The website aspect was relatively easy with a combination of the usual HTML, Javascript, PHP and CSS; it was rewriting and integrating the remote sensing scripts that was difficult. Finally, all of those little, and sometimes not so little, scripts and programs were available from a convenient web interface, bringing much satisfaction and pride for all those hours of heavy thinking and brainstorming. Hopefully you will read more about this development from Pixalytics in the future, as it will form the back-end of their product suite to be launched in the near future.

During my internship there was also time for events inside the Science Park, such as the Hog Roast, and events outside, such as the South-West England QGIS User Group meeting in Dartmoor National Park. While that meeting was not exactly about remote sensing, being more on the Geographic Information System (GIS) side, it made me realize how much I had learned about remote sensing in my short time at Pixalytics, and I was able to exchange opinions and points of view with other people who were keen on the subject.

A side project I worked on in my final weeks was looking across the world for stunning, interesting (and possibly both) places on Earth to make postcards from – such as the one at the top of the blog. At times, programming and reading scientific research can get challenging and/or frustrating, and it’s relaxing to just look at and enjoy the beauty of our planet.

It is something that anyone can do, as it takes little knowledge of remote sensing. Free satellite imagery is available from a variety of sources; I found imagery from USGS/NASA Landsat-8 and ESA Sentinel-2 quite easy to access and use. It is definitely something I would recommend.

Finally, I want to say “thank you” to Sam and Andy, without whom I would never have had the opportunity to get the most out of this experience, in a field I have always been interested in but had never had the chance to actually get my hands on.

Blog written by Davide Mainas on an ERASMUS+ internship with Pixalytics via the Tellus Group.

Gathering of the UK Remote Sensing Clans


The Remote Sensing & Photogrammetry Society (RSPSoc) 2016 Annual Conference is taking place this week, hosted by the University of Nottingham and the British Geological Survey. Two Pixalytics staff, Dr Sam Lavender and Dr Louisa Reynolds, left Plymouth on a cold, wet Monday and arrived in the Nottinghamshire sunshine, as befits RSPSoc week. The conference runs for three days and gives an opportunity to hear about new developments and research within remote sensing. Both Sam and Louisa are giving presentations this year.

Tuesday morning began with the opening keynote presentation, given by Stephen Coulson of the European Space Agency (ESA), which discussed ESA’s comprehensive programme, including the Copernicus and Earth Explorer missions. The Copernicus missions are generating ten times more data than similar previous missions, which presents logistical, processing and storage challenges for users. The future vision is to bring the user to the data, rather than the other way around. However, the benefits of cloud computing are still to be fully understood, and ESA are interested in hearing about applications that couldn’t have been produced with the IT technology we had five years ago.

After coffee Sam chaired the commercial session titled ‘The challenges (and rewards) of converting scientific research into commercial products.’ It started with three short viewpoint presentations from Jonathan Shears (Telespazio VEGA UK), Dr Sarah Johnson (University of Leicester) and Mark Jarman (Satellite Applications Catapult), and then moved into an interactive debate. It was great to see good attendance and a lively discussion ensued. Sam is planning to produce a white paper, with colleagues, based on the session. Some of the key points included:

  • Having informative websites so people know what you do;
  • Working with enthusiastic individuals, as they will make sure something happens; and
  • Having a strong commercial business case alongside technical feasibility.

Dr Louisa Reynolds, Pixalytics Ltd, giving a presentation at RSPSoc 2016

Louisa presented on Tuesday afternoon within the Hazards and Disaster Risk Reduction session. Her presentation, ‘A semi-automated flood mapping procedure using statistical SAR backscatter analysis’, summarised the work Pixalytics has been doing on flood mapping over the last year, funded by the Space for Smarter Government Programme (SSGP). Louisa was the third presenter to show Sentinel-1 flood maps of York, so it is clearly a popular topic!

Alongside Louisa’s presentation, there have been some fascinating other talks on topics as varied as:

  • Detecting and monitoring artisanal oil refining in the Niger Delta;
  • Night-time lidar reading of long-eroded gravestones;
  • Photogrammetric maps of ancient water management features in Al-Jufra, Libya;
  • Seismic risk in Crete; and
  • The activities of MapAction.

For Louisa, though, the favourite part so far was watching a video of the launch of Sentinel-1A, through the Soyuz VS07 rocket’s stage separations and deployment, simultaneously filmed from the craft and from the ground.

Just so you don’t think the whole event is about remote sensing, the conference also has a thriving social scene. On Monday there was a tour of The City Ground, legendary home of Nottingham Forest, by John McGovern, who captained Forest to successive European Cups in 1979 and 1980. It was a great event, and it was fascinating to hear about the irascible leadership style of Brian Clough. Tuesday’s event was a tour round the spooky Galleries of Justice Museum.

The society’s Annual General Meeting takes place on Wednesday morning; Sam’s presentation, ‘Monitoring Land Cover Dynamics: Bringing together Landsat-8 and Sentinel-2 data’, is in the Land Use/Land Cover Mapping session which follows.

The start of RSPSoc has been great as usual, offering chances to catch up with old remote sensing friends and meet some new ones. We are looking forward to the rest of the conference and to 2017!

Spinning Python in Green Spaces

2016 map of green spaces in Plymouth, using Sentinel-2 data courtesy of Copernicus/ESA.

As students, we are forever encouraged to find work experience to develop our real-life skills and enhance our CVs. During the early part of my second year I was thinking about possible work experience for the following summer, and thanks to my university department I found the Space Placements in INdustry (SPIN) scheme. SPIN has been running for four years now, advertising short summer placements at host companies; these give students on maths, physics and computer science degrees an insight into the thriving space sector. I chose to apply to Pixalytics and, three months later in late March, they accepted my application.

Fast forward a few more months and I was on the familiar train down to Plymouth, in my home county of Devon. Living in a new place never fails to confuse, but with perseverance I managed to settle in quickly. Similarly, I was able to connect knowledge from my degree (such as atmospheric physics and statistics) to the subject of remote sensing, a topic I had not previously studied. Within a few days I was at work on my own projects, learning more along the way.

My first task was an informal investigation into open data that Plymouth City Council (PCC) has recently uploaded to the web. PCC are looking for ways to create and support innovative business ideas that could potentially use open data and, given its background, Pixalytics could see the potential in developing this. I used PCC’s green space, nature reserve and neighbourhood open data sets and found a way to calculate areas of green space in Plymouth, using Landsat/Sentinel-2 satellite data to provide a comparison.

Sentinel-2 Image of Plymouth from 2016. Data courtesy of Copernicus/ESA.

There were a few challenges to overcome in using the multiple PCC data sets, as they had different coordinate reference systems, which needed to be consistent for use in GIS software. For example, the Nature Reserves data set was partly in WGS84 and partly in OSGB 1936, the green space data set is in WGS84, and the neighbourhood boundaries are in OSGB 1936. This meant that, after importing these data sets into GIS software, they wouldn’t line up. Also, the green space data set didn’t include landmarks such as the disused Plymouth City Airport and large areas around Derriford Hospital and Ernesettle. Using GIS software, I then went on to find a way to classify and calculate areas of green space within the Plymouth city boundary. The Sentinel-2 image, which can be seen above, has a higher spatial resolution and allowed me to include front and back gardens.
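Getting the mismatched layers onto a single coordinate reference system is straightforward in Python. Here is a minimal sketch with the pyproj module, converting British National Grid (OSGB 1936, EPSG:27700) eastings and northings to WGS84 longitude and latitude; the coordinates are hypothetical:

```python
from pyproj import Transformer

# OSGB 1936 / British National Grid (EPSG:27700) -> WGS84 (EPSG:4326).
transformer = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)

# Hypothetical easting/northing somewhere near Plymouth.
easting, northing = 248000, 55000
lon, lat = transformer.transform(easting, northing)
print(lat, lon)
```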

My green space map for 2016, created from Sentinel-2 data, is the most accurate, and gives a total area of green space within the Plymouth neighbourhood boundary of 43 square kilometres, compared with the 28 square kilometres that PCC have designated within their dataset. There are some obvious, explainable differences, but it would be interesting to explore this further.

My second project was to write computer code for the processing and mosaicking of Landsat imagery. Pixalytics is developing products where the user can select an area of interest from a global map, and this can cause difficulties if the area crosses multiple images. My work was to make these images as continuous as possible, accounting for the differences in radiance.

I ended up developing a Python package whose functions include obtaining the WRS path and row from an input latitude and longitude, correcting for differences in radiance, and clipping and merging multiple images. There is also code that helps reduce the visual impact of clouds on individual images by using the quality band of the Landsat 8 product. This project took up most of my time; however, I don’t think readers would appreciate, let alone read, a 500-line Python script, so it has been left out.
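As a small taste of the cloud-screening part, though, the sketch below masks pixels whose Landsat 8 quality band has the cloud bit set. The bit position differs between Landsat Collections (bit 4 in the Collection 1 BQA band, bit 3 in the Collection 2 QA_PIXEL band), so check the product guide; bit 3 and the file name below are assumptions:

```python
import rasterio

CLOUD_BIT = 3  # Collection 2 QA_PIXEL cloud bit (assumption; see note above)

# Hypothetical Landsat 8 quality band for one scene.
with rasterio.open("LC08_QA_PIXEL.tif") as src:
    qa = src.read(1)

# True where the cloud bit is set; use this to blank out cloudy pixels
# before mosaicking neighbouring scenes.
cloud_mask = (qa & (1 << CLOUD_BIT)) > 0
```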

I’d like to take this opportunity to thank Andrew and Samantha for giving me an insight into this niche, and potentially lucrative, area of science, as it has given me some direction and motivation for the final year of my degree. I hope I’ve provided some useful input to Pixalytics (even if it is just giving Samantha a very long-winded Python lesson), because they certainly have for me!

 

Blog written by:
Miles Lemmer, SPIN Summer Placement student.
BSc. Environmental Physics, University of Reading.