Identifying Urban Sprawl in Plymouth

Map showing urban sprawl over the last 25 years in the areas surrounding Plymouth

Nowadays you can answer a wide range of environmental questions yourself using only open source software and free remote sensing satellite data. You do not need to be a researcher: by acquiring a few skills you can have the analysis of complex problems at your fingertips. It is amazing.

I’ve been based at Pixalytics in Plymouth over the last few months on an ERASMUS+ placement, and decided to use Plymouth to look at one of the most problematic environmental issues for planners: urban sprawl. It is a well-known phenomenon within cities, but it can’t be easily seen from ground level – you need to look at it from space.

The pressure of continued population growth, the need for more living space, and commercial and economic development mean that central urban areas tend to expand into low-density, monofunctional and usually car-dependent communities, with a highly negative ecological impact on fauna and flora and massive losses of natural habitats and agricultural land. This change in how land is used around cities is known as urban sprawl.

As a city Plymouth suffered a lot of destruction in World War Two, and there was a lot of building within the city in the 1950s and 1960s. Therefore, I decided to see if Plymouth has suffered from urban sprawl over the last twenty-five years, using open source software and data. The two questions I want to answer are:

  1. Is Plymouth affected by urban sprawl? and
  2. If it is, what are Plymouth’s urban sprawl trends?

1) Is Plymouth affected by urban sprawl?
To answer this question I used the QGIS software to analyse Landsat data from both 1990 and 2015, together with OpenStreetMap data for natural areas, covering a 15 kilometre area around Plymouth’s city centre.

I then performed a Landscape Evolution analysis, as described in Chapter 9 of the Practical Handbook of Remote Sensing, written by Samantha and Andrew Lavender from Pixalytics. Firstly, I overlaid natural areas onto the map of Plymouth, then added the built-up areas from 2015, shown in red, and finally added the 1990 built-up areas in grey.
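To make the change-detection step concrete, here is a minimal sketch of how the two classified built-up maps could be compared programmatically, assuming the 1990 and 2015 classifications have been exported from QGIS as co-registered single-band GeoTIFFs; the file names and the class value 1 for built-up pixels are hypothetical:

```python
import rasterio

# Hypothetical co-registered classification rasters exported from QGIS;
# in this sketch a pixel value of 1 means "built-up".
with rasterio.open("builtup_1990.tif") as src:
    built_1990 = src.read(1) == 1
    # Pixel area from the affine transform (width x height, in map units).
    pixel_area_m2 = abs(src.transform.a * src.transform.e)

with rasterio.open("builtup_2015.tif") as src:
    built_2015 = src.read(1) == 1

# Pixels that were built-up in 2015 but not in 1990: the sprawl mask
# that would be drawn in red on top of the grey 1990 extent.
new_builtup = built_2015 & ~built_1990

area_1990 = built_1990.sum() * pixel_area_m2 / 1e6  # km^2
area_2015 = built_2015.sum() * pixel_area_m2 / 1e6  # km^2
print(f"Built-up area 1990: {area_1990:.1f} km^2")
print(f"Built-up area 2015: {area_2015:.1f} km^2")
print(f"New development:    {new_builtup.sum() * pixel_area_m2 / 1e6:.1f} km^2")
print(f"Increase: {100 * (area_2015 - area_1990) / area_1990:.1f}%")
```

A percentage increase computed this way is the kind of figure quoted for the 15 km study area below.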

Detailed map showing the key urban sprawl around Plymouth over the last 25 years

The map, which has an accuracy of 80–85%, shows that no major urban development occurred in the city of Plymouth and its surroundings over the last 25 years – although this is about to change with the development of the new town of Sherford on the outskirts of the city.

However, as you can see in the zoomed-in version of the map on the right, there is a noticeable urban development visible in the north west of the city, and a second around Saltash in Cornwall, across the Tamar on the western side of the map. The built-up area within 15 km of Plymouth increased by around 15% over the 25-year period. The next question is: what are the trends of this sprawl?

2) What are Plymouth’s urban sprawl trends?
A large body of research tries to categorise urban sprawl into various types:

  • Compact growth, which infills existing urban developments, is also known as smart growth and mainly occurs in areas with planning permission
  • Linear development along main roads
  • Isolated developments in agricultural or wildlife areas close to major roads.

The last two have a bad reputation and are often associated with negative impacts on the environment.

Various driving forces are behind these growth types, creating different patterns for cities worldwide. For example, rapid economic development under a liberal planning policy drives population growth in a city, which then expands and incorporates nearby or outlying villages over time. This is a fragmented approach, and results in significant land loss.

But this is not the case for Plymouth, which over the last 25 years has shown stable development within the extent permitted by planning policies, with predominantly infill and compact expansion – a smart growth approach that other cities could take as an example.

These conclusions can be reached in only a few simple steps, taking advantage of free open source software and free data, without extensive experience or training.
It is a proven example of how you can make your own maps at home without investing too much time or money.

This is the end of my internship with Pixalytics, and it has been one of my best experiences.

Blog written by Catalin Cimpianu, ERASMUS+ Placement at Pixalytics.

Current Work in Remote Sensing and Photogrammetry

Last week the annual Remote Sensing and Photogrammetry Society (RSPSoc) conference was held in Aberystwyth. Having stepped down as RSPSoc Chairman, I could relax and enjoy this year’s event as a delegate.

Arriving on Wednesday morning, the first session I attended was organised by the Technology and Operational Procedures Special Interest Group (TOPSIG), and was focused on operational Earth observation. There was a great range of presentations, and I particularly enjoyed the user insights by Andy Wells on how customers are really using imagery. Recent developments in on-the-fly importing, georeferencing and autocorrelation mean bringing data together from different sources isn’t a time-consuming chore. Users can therefore spend more time analysing data, extracting information and adding value to their organisations or research. In addition, as highlighted by other presentations, open software repositories continue to grow and now include complex algorithms that were once only available to specialists. Finally, Steve Keyworth reminded us that what we do should be seen as a component of the solution rather than the specification; the ultimate aim should be solving the customer’s problem, which in the current climate is often financially motivated.

Landsat 7 image showing features in the Baltic, data courtesy of ESA

On Thursday I co-chaired the Water and Marine Environments session alongside Professor Heiko Balzter, on behalf of the Marine Optics Special Interest Group (SIG). My presentation focused on the European Space Agency (ESA) Landsat archive that’s been acquired via the ESA ground stations. This data is being reprocessed to create a consistent high resolution visible and infrared image dataset combining the three primary sensors used by the series of Landsat satellites: MSS (Multi-spectral Scanner), TM (Thematic Mapper), and ETM+ (Enhanced Thematic Mapper Plus). Although historical Landsat missions are not ideally suited to observing the ocean, due to a low signal-to-noise ratio, features can be clearly seen, and the new processing setup means images are now being processed over the open ocean.

Mark Danson’s keynote lecture on Friday morning described the application of terrestrial laser scanners to understanding forest structure. He showcased his post-PhD research, which has led to the development of the Salford Advanced Laser Canopy Analyser, a dual-wavelength full-waveform laser scanner. The presentation also showed the importance of fieldwork in understanding what remote techniques are actually sensing, which in this case included a team of people cutting down example trees and counting every leaf!

Mark also made me feel less guilty that I am still working on a component of my PhD – atmospheric correction. In research your own learning curve, and the scientific process, mean you gain new insights as you understand more, often explaining why answers are not as simple as you might have assumed. It’s one of the reasons why I love doing research.

Overall, I had a great time at RSPSoc, catching up and seeing what’s new in the field. My next conference event is Ocean Optics, in the US, at the end of October where I’ll be discussing citizen science in a marine science context.

Extrapolation is how we predict the future

A blog post by Adam Mrozek, placement student with Pixalytics in October/November 2013.

I grew up in a very small village with very few inhabitants. As my father very often burned the midnight oil, my mother was the only person to look after me. But I lacked for nothing as she was both mother and father to me.

I can still see a vivid image of my mother celebrating St Andrew’s Eve. During this celebration people try to predict the future using a piece of hot wax and a bucket of cold water.

She had dipped the wax in the bucket of water and tried to interpret the shape. It was a common belief that the shape of wax reveals the future.

As I grew older I left all the superstitious beliefs behind and took up mathematics instead. Believe it or not, it can help us predict the future, at least to some extent.

Let’s consider a set of points collected above a river. Each point represents a different height measured by the satellite.

Points

As the satellite orbits our planet, it provides more data.

More points over time

Why do points appear in different places? Because the satellite is unable to pass over exactly the same spot twice in succession as it moves around the Earth (see polar orbits).

How is the water height going to change in the future?

I am going to show the simplest approach available. Namely, the High School approach. We can use extrapolation in order to work out this problem.

Each point can be represented by a set of three elements, i.e. x and y (position) together with height.

Let us calculate the average height at every instant in time. We then no longer have to worry about the x and y coordinates; we represent every period of time with one value only.

It becomes easy to draw all the points on a single chart. Like this…

Extrapolation

We can now extrapolate the graph easily and tell how the height is going to change in the near future.

This is a very simple approach and will only work for small areas and short time periods.
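For anyone who wants to try this at home, here is a minimal sketch of the whole recipe in Python with NumPy. The times and heights are made-up illustrative values, not real satellite measurements:

```python
import numpy as np

# Hypothetical satellite passes: for each time step (days), a handful of
# height measurements (metres) taken at slightly different x, y positions.
passes = {
    0:  [12.1, 12.3, 11.9],
    10: [12.4, 12.6],
    20: [12.8, 12.7, 12.9],
    30: [13.1, 13.0],
}

# Step 1: collapse each pass to a single average height, so the x and y
# coordinates no longer matter.
times = np.array(sorted(passes))
mean_heights = np.array([np.mean(passes[t]) for t in times])

# Step 2: fit a straight line (degree-1 polynomial) through the averages.
slope, intercept = np.polyfit(times, mean_heights, deg=1)

# Step 3: extrapolate the line to a future time, e.g. day 40.
future_day = 40
print(f"Predicted mean height on day {future_day}: "
      f"{slope * future_day + intercept:.2f} m")
```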

Using this method we can predict how rivers will behave in the future. More advanced methods are used to study various stretches of the Congo River, telling us whether the river will flood its surroundings or not.

Whoever controls the data, controls the service!

Like many people, last week we watched the US Congress fail to pass the federal budget and shut down the US Government. Putting aside the ridiculous scenario that the world’s largest economy is closed, and the financial hardship being inflicted on its public service workers, as a small company in the UK that doesn’t currently work for America or American companies, we didn’t expect to be impacted commercially. We were wrong!

NASA, the National Oceanic and Atmospheric Administration (NOAA) and the US Geological Survey (USGS) control a significant number of satellites and the remote sensing data streams from them. The federal shutdown has closed all their websites and associated infrastructure, although NOAA has kept Weather.gov open for critical weather information only. You’d expect no satellites to be launched during this period, you’d probably expect that should anything go wrong it wouldn’t get repaired, and you might even know that some data downloading processes require human intervention and could be impacted too. But surely that’s all? Websites can operate quite happily on their own, can’t they?

Given the immense size of remote sensing data sets, in the region of multiple terabytes, many academic and commercial organisations download data when they need it, rather than bear the cost of massive data storage facilities. This is where the real impact of the federal shutdown bites. These datasets are downloaded from websites which have been closed by the federal shutdown. It’s not that they aren’t being updated; it’s a total shutdown. The websites simply have a front page stating that, due to the lapse in federal funding, the website is not available. Some specialist sites are still up if you know where to find them, but even they say information may be out of date. Also, the Twitter feeds of NASA, NOAA and USGS have stopped tweeting!

Everyone assumes that the data is still being downloaded in the USA, and will be processed and made available once the federal shutdown is resolved. A little delay maybe, but no major issue for research – unless, of course, something has gone wrong and data isn’t being downloaded. Will researchers in the future have to refer to the 2013 Data Black Hole or the Federal Fault of 13 in their trend analysis?

However, what about time-critical applications? Remote sensing is being used to provide services such as flood and disaster monitoring, crop watering and oceanographic applications. How many of those customers, or suppliers, realised that their ability to receive or deliver those services was dependent on the American government? Anyone relying on Landsat or MODIS data downloaded from a US website is currently becalmed without a data stream. The European MyOcean service is reporting degraded and interrupted ocean colour products due to a lack of spatial coverage.

Companies who want to provide reliable, consistent and dependable remote sensing applications really need to control the data stream alongside the application. This essentially means having your own ground stations to receive data, which is out of the reach of most organisations.

This week has clearly shown whoever controls the data stream, controls the service. How much of your service pathway do you control?

EO Applications and Visualisation

I’m concluding my trio of conference blogs by focussing on Earth Observation (EO) and visualisation.

Within the data visualisation session at ESA’s Living Planet conference, Planetary Visions Limited gave a great talk entitled ‘Presenting Data and Telling Stories’. They highlighted the importance of knowing the audience that you’re communicating with, particularly the difference between presenting data to the general public versus fellow scientists.

The key element of communication to remember is that it’s only complete when the recipient understands the message. Scientific figures are often presented using a colour palette, which applies artificial colours to black and white images (the rainbow palette is shown below, going from low values in purple / dark blue to high values in red). This enables scientists to easily extract values and get a detailed view of the results. That’s fine for scientists who understand this principle, but anyone who doesn’t may not understand the data presented or, even worse, may draw incorrect conclusions.

Rainbow Colour Palette
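As a simple illustration of the palette idea, here is a hedged sketch using Matplotlib and a synthetic data array; the built-in ‘jet’ colormap stands in for the rainbow palette shown above:

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic single-band "image": values vary smoothly, as in many
# scientific data products.
x = np.linspace(0, 4 * np.pi, 200)
data = np.sin(x)[None, :] * np.linspace(0.2, 1.0, 100)[:, None]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Left: the raw single-band data shown in greyscale.
im1 = ax1.imshow(data, cmap="gray")
ax1.set_title("Raw single-band data")
fig.colorbar(im1, ax=ax1)

# Right: artificial colours applied, low values blue through to high
# values red, making it easier for a specialist to read off values.
im2 = ax2.imshow(data, cmap="jet")
ax2.set_title("Rainbow palette applied")
fig.colorbar(im2, ax=ax2)

plt.tight_layout()
plt.show()
```

Swapping the colormap for a perceptually uniform palette such as ‘viridis’ is one common compromise between detail for specialists and readability for everyone else.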

To make data widely understood, it needs to be presented in a way that makes it immediately clear what is being shown. This means that sometimes you need to focus only on the overall message, and sacrifice elements of the detail that allow the extraction of values, differences and trends.

‘Infographic’ is the on-trend term for a visual representation of complex data in a chart or graphic so that it’s quickly understood. There’s recently been an explosion of software, and websites, offering to create infographics for you. However, infographics aren’t new. A long-running example is the London Underground map; it isn’t a precise map of the tunnels, but it gives everyone the information they need to use the Underground.

The key to a good infographic, and to other graphs and charts, is keeping it simple. Focus on the information that you want to convey, rather than graphical embellishments. Edward Tufte argued in ‘The Visual Display of Quantitative Information’ that graphics should be assessed in terms of the data-ink ratio: as much of the ink as possible should be devoted to showing the data itself.

Look at your figures and ask yourself whether the axes, grid lines, legends, numbers, borders or any of the other fancy bits in PowerPoint add anything to the image, or whether they make it more confusing and less understandable.
Next time you have to present, don’t just pull out the presentation you gave at another conference last month and assume it will do. Think about who is going to be listening to you. Will they understand the data as you’ve presented it? If you don’t think they will, change it!

Scientists need to think about the visualisation of data, as well as the raw data, if it’s going to be understood by more than those who created it. Data becomes information when it’s presented in a context that makes it useful. Always turn your data into information; your audience will thank you!

EO applications and the ESA Living Planet Symposium

Last week’s blog looked at developments in the technology providing Earth Observation (EO); however, the industry is evolving and much more attention is now being paid to downstream activities. It’s no longer good enough to get a satellite to collect data; everyone has to think about how applications will, and can, use the data.

At the Living Planet Symposium there were presentations on the applications being developed from the European Space Agency’s (ESA) CryoSat-2, which was launched in April 2010 as a replacement for CryoSat-1, lost in a launch failure in 2005. CryoSat-2’s main focus is monitoring the thickness of sea ice in the polar oceans and of the ice sheets over Greenland and Antarctica. During its three years of full operation it has witnessed a continuing shrinkage of winter ice volume.

However, the on-board altimeter can also be used for many other applications; for example, it doesn’t just acquire data over the polar regions. More interestingly, the presenters also showed its potential for mapping coastal waters and inland water bodies with a spatial coverage that’s not possible from current low-resolution altimeters.

Freshwater is a scarce resource – 97.5% of the Earth’s water is saltwater – and almost three quarters of the freshwater is used in agriculture to grow food. The benefits of developing a method for remotely obtaining accurate river and lake water heights with frequent coverage are therefore obvious.

No doubt there will be a variety of new applications developed using this freshwater data over the coming period. However, these applications need to keep one eye on the next significant revolution in EO: data visualisation. It’s becoming vital that data is made available in a form that is understandable to non-scientists, and this will be the subject of next week’s blog!

An EO conference roundup: RSPSoc 2013 and the ESA Living Planet Symposium

It’s conference season! I’m at my 2nd conference in 2 weeks, both in Scotland.

Last week was the Remote Sensing & Photogrammetry Society Annual Conference, #RSPSoc2013, hosted in Glasgow. It included a broad range of sessions and scientific output within the ‘family’ atmosphere that you find within societies.

The conference started off with a keynote from Dr. Stewart Walker (BAE Systems and President-Elect of the American Society for Photogrammetry and Remote Sensing) reviewing the history and innovations in photogrammetry. I was fascinated to find out that in the early days of remote sensing (the 1960s), US military satellites ejected cans of photographic film, which were picked up by aircraft as they fell to Earth, to get high resolution data.

He also showed that since then the number of high resolution optical satellites, and the capacity of those satellites to capture information, has continued to increase, along with the speed at which an end user can receive captured data. Today Unmanned Aerial Vehicles (UAVs) have the capability to take very high resolution video that can see objects as small as a songbird.

For me the most incisive comment he made was when he was summarising his own career, where he said that leaders don’t only develop science, but also develop people who develop science. Something worth remembering by every scientific business.

The second keynote was provided by Craig Clark MBE (Clyde Space), and showcased the growth of the company that is leading the UK Space Agency’s programme to design and launch a cubesat: UKube-1, which is due for launch in December.

Cubesats are small satellites built in units of 10 cm cubes, with UKube-1 being 3U, i.e. three cubes in length. These are not the smallest satellites to be launched, but they offer the potential to provide scientific-quality missions at a much lower cost than conventional satellites, allowing developers to be more innovative with technologies and offering the potential for constellation, rather than single, missions. This won’t be the end of conventional larger satellites, as they are still needed for the capture of complex high quality data sets. But these two technologies will give greater flexibility for data capture.

This week I’m at the European Space Agency’s Living Planet Symposium (http://www.livingplanet2013.org/). Still a ‘family’ atmosphere, but a much larger family, with around 1,700 attendees in Edinburgh. The conference has showcased ESA’s historical, current and future missions, including SWARM, which will be launched in November, and the first Copernicus mission (Sentinel-1), which will launch in 2014.

The SWARM constellation (three satellites) will measure the Earth’s magnetic field, which protects us from cosmic radiation and charged particles arriving from the Sun. Sentinel-1, meanwhile, is a radar mission with many different applications, as it provides a view of surface roughness – a rough surface will reflect strongly while a smooth surface will reflect weakly – and is available day and night, irrespective of cloud cover. Examples include tracking vessel movements at sea, monitoring forests and looking at the growth of mega-cities.

The last week has reminded me that remote sensing and photogrammetry are changing and fast moving fields; new technologies are offering us greater opportunities and flexibilities. But as Dr Walker reminded us, behind all these developments are some amazing people.

Business with social inklings

I recently read Richard Branson’s ‘Screw Business as Usual’, and his idea of ‘Business as a Force for Good’ struck a chord with me. For me a business is more than just financial figures; its ethos and values are just as important. This is not about having a lovely glossy brochure, but about how the business stands up for, and demonstrates, the values it believes in.

I know not everyone looks at business in this way. Maybe because I never dreamed of running a company, my view of the corporate world is influenced by my academic and scientific background. I transitioned from academia to business not for the pursuit of profit, but because it appeared to be the best way of getting where I wanted; and I’ve always liked a challenge!

From the last 5+ years of running companies, I’ve decided that business is just a word, whereas the vision of a business is a more personal reflection of those leading it. Whilst I’ve come to appreciate the importance of cashflow and profit to both company stability and growth, how the money is used is still an extension of the company’s values.

My vision for Pixalytics is for it to be a company that provides information and knowledge to people who need it, without them needing to understand the complexity of the underlying data gathering and processing. This will require a convergence of knowledge around business, research, innovation, science and education to be successful.

These knowledge strands must go beyond the purely financial; they need to be embedded in our values and ethos. I try personally to demonstrate this by:
• Exploring and learning about how different businesses work.
• Setting time aside to focus on scientific research.
• Writing papers/articles and giving presentations to develop and communicate ideas.
• Volunteering within a number of learned/trade associations.
• Supporting students and new entrepreneurs.

Obviously doing these things takes time, and hence time away from earning money. However, through our authenticity and values, I believe Pixalytics will be stronger and a force for good.

Completing the PhD publication triple

Some great news this week! Dr Susan Kay’s third paper from her PhD has been accepted for publication by Applied Optics. Entitled “Sun glint estimation in marine satellite images: a comparison of results from calculation and radiative transfer modeling”, it nicely shows the impact of choosing different models for the sea surface elevation and slope when predicting sun glint. In response to the notice of publication Sue said “It’s great to see that last bit of PhD work finished. Now I’d better get writing about marine ecosystem modelling!” which is her current research at Plymouth Marine Laboratory.

I’ve been one of Sue’s PhD supervisors, alongside Dr John Hedley, and it’s wonderful that she’s had three papers published. There is always an expectation that PhD students will produce papers during their studies, on top of writing up their PhD. However, peer-reviewed publications aren’t easy to achieve: more and more papers are being produced, but scientists only have limited time to act as reviewers. The consequence is that journals are tending more towards a straightforward acceptance or rejection, rather than longer, supported revision processes. Over the 20+ years I’ve supervised students, some have published several peer-reviewed papers whilst others have not managed to get one accepted. I never achieved a first-authored paper during my own PhD.

I think the differentiating success factors in getting publications are writing up research that is novel (rather than incremental), maintaining self-belief in your work, plus a small measure of luck. Many times a paper has been rejected, only to be accepted by another journal after revisions; but for a PhD student, rejection can be very disheartening, especially if it’s their first paper.

Therefore, if you get rejected, don’t be downhearted. Use the valuable reviewer feedback to look at the paper with fresh eyes, and give careful thought to where to submit next. A lower-ranked journal can be better for a first PhD submission, especially if the research is still in the initial stages of development. Believe in your work, believe in yourself, and send the paper out again, over and over, until it’s accepted. You never know, you might get three papers published like Sue!