Controlling the Space Industry Narrative

The narrative of the satellite industry over the last week had all the components of a blockbuster novel or film: new adventures beginning, dramatic challenges to overcome, redemption, and an emotional ending.

Artist's rendition of a satellite - paulfleet/123RF Stock Photo

Like lots of good stories, we start with characters setting off on new adventures. First, China launched its most powerful imaging satellite to date, Gaofen-2. It carries a High Resolution Optical Imager capable of providing images with a spatial resolution of 80cm in panchromatic mode and 3.2m in multispectral mode, with a swath width of 48km. It is the second in a series of seven Earth observation (EO) satellites, following Gaofen-1 launched in April 2013, which will provide environmental monitoring, disaster management support, urban planning and geographical mapping. The Long March 4B rocket that launched Gaofen-2 redeemed itself following a failure last December that caused the loss of the CBERS-3 EO satellite. The second significant launch was from the International Space Station on the 19th August, when the first pair of the twenty-eight satellite Flock 1B constellation was deployed, with further pairs sent on the 20th, 21st and 23rd. Flock 1B is one of three Earth-imaging nanosatellite constellations from Planet Labs, providing images with a spatial resolution of between 3m and 5m.

ESA’s Galileo satellites, Doresa and Milena, provided the drama by failing to reach their planned orbit of 29,900km, instead reaching an orbit of 26,900km; in addition, their inclination to the equator is 49.8 degrees, rather than the planned 55 degrees. They were the fifth and sixth satellites of Europe’s version of the American GPS satellite navigation system, launched on a Soyuz rocket. Getting the satellites to their correct positions is likely to require more fuel than they carry. Like the Long March 4B, Soyuz will get its chance of redemption in December with the launch of the next two Galileo satellites.

The Tropical Rainfall Measuring Mission (TRMM), a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA), provides the emotional end to our story with last week’s announcement that it has run out of fuel. Launched in 1997 with a three-year life expectancy, TRMM will now provide an incredible nineteen years’ worth of data. It will continue collecting until early 2016, when its instruments will be turned off in preparation for re-entry.

It’s interesting to see how this news was reported in the mainstream media: little mention of China’s progress, the second Flock constellation or the amazing longevity of TRMM; instead, the focus was on the failure of the Galileo satellites. There is rarely widespread coverage of successful satellite launches, but there is a push within the UK for the community to celebrate its successes more, so that the full range of space activities can be seen.

Earth observation is all about data and images, and whilst these may interest people, it’s only through the power of storytelling that we can describe the positives of the industry, motivating and inspiring people. Remember to create the stories for your industry, and your company, or someone else will dictate the narrative.

Why is Understanding Spatial Resolution Important?

Spatial resolution is a key characteristic in remote sensing, where it’s often used to refer to the size of pixels within an acquired image. However, this is a simplification, as the detector on the satellite doesn’t see the square suggested by a pixel; rather, it sees an ellipse, due to the angle through which the detector receives the signal – known as the instantaneous field of view (IFOV). The ellipses are turned into square pixels by data processing when creating the image.
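
As a rough illustration of how the instantaneous field of view relates to pixel size, the small-angle sketch below estimates a detector's ground footprint at nadir. The altitude and IFOV figures are assumed, Landsat-like values chosen purely for demonstration, not taken from any mission specification.

```python
def ground_sample_distance(altitude_m, ifov_rad):
    """Approximate diameter of a detector's ground footprint at nadir:
    for small angles, footprint ~ altitude * IFOV (IFOV in radians)."""
    return altitude_m * ifov_rad

# Assumed, Landsat-like values: ~705 km altitude, ~42.5 microradian IFOV,
# giving a footprint of roughly 30 m.
print(round(ground_sample_distance(705_000, 42.5e-6), 1))
```

Away from nadir the footprint stretches into the ellipse described above, so real ground sample distances grow towards the edge of the swath.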

The area of the port of Rotterdam shown using a Landsat image (background) at 30m resolution and MERIS full resolution image (inset image) at 300m resolution; data courtesy of the USGS and ESA. Example used within Hydrographic Academy eLearning material.

Therefore, for example, when viewing an image with 1km resolution, not only will you be unable to see anything smaller than 1km in size, but objects need to be significantly larger than 1km for any detail to be discernible. Whilst this might be fine if you are looking at changes in temperature across the Atlantic Ocean, it won’t be much use if you are interested in suspended sediment blooms at the mouth of a small river.

Any image with a spatial resolution of between 50m and 1km is described as having low spatial resolution. For example, MODIS operates at low spatial resolutions ranging from 250m to 1,000m, as its primary focus is global mapping rather than capturing detailed imagery of local regions.

If you want to look at smaller objects, you’ll need to use images with medium spatial resolutions of between 4m and 50m. There is quite a lot of freely available imagery within this range: for example, NASA’s Landsat 8 operates at 15m, 30m and 100m resolutions, and ESA’s Sentinel-1A operates at the three resolutions of 5m, 20m and 100m. If you want to go even finer, you will require high spatial resolution images, which go down to resolutions of between 4m and 1m, or very high spatial resolution images, which cover the 0.5m – 1m range. Commercial organisations tend to operate satellites with these higher levels of resolution, and they charge for making the images available. It’s likely that military satellites offer imagery down to 0.15m, but there are regulations in place to prevent the sale of extremely high resolution imagery, as it’s considered a potential danger to security.
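
The resolution bands described above can be summarised in a small Python helper. The class boundaries simply follow the figures in this post; they are indicative rather than a formal industry standard.

```python
def resolution_class(metres):
    """Map a pixel size in metres to the descriptive classes used in this post."""
    if metres <= 1:
        return "very high"
    elif metres <= 4:
        return "high"
    elif metres <= 50:
        return "medium"
    elif metres <= 1000:
        return "low"
    return "very low"

print(resolution_class(0.31))  # WorldView-3 panchromatic
print(resolution_class(30))    # Landsat 8 multispectral
print(resolution_class(300))   # MERIS full resolution
```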

Spatial resolution was in the headlines last week with the launch of DigitalGlobe’s WorldView-3 satellite, which can produce spectral images with a resolution down to 0.31m. Technologies to produce images at this resolution have been around for some time, but, as reported by Reuters in June, DigitalGlobe has only recently received a licence from the US Commerce Department to start selling images with resolutions down to 0.25m; without this licence they wouldn’t be able to sell this higher resolution imagery.

Regulatory involvement in very high resolution imagery was also demonstrated earlier this year when, in January, the UK government blocked the European Commission’s effort to set common European regulations on the sale of high-resolution satellite imagery. The UK government currently controls access to data through export licensing conditions on the satellite hardware, and it felt that regulations would impact the UK’s ability to export space technology.

Therefore, spatial resolution is an important term, and one every remote sensing client should understand. Different services require different spatial resolutions, and selecting the most appropriate resolution for your needs will not only ensure that you get exactly what you want, but could also save you money, as you don’t want to over-specify.

Rosetta: Extra-terrestrial Observation

Full-frame NAVCAM image taken on 9 August 2014 from a distance of about 99 km from comet 67P/Churyumov-Gerasimenko. Image: ESA/Rosetta/NAVCAM

Most people will have seen last week’s news about ESA’s Rosetta spacecraft arriving at comet 67P/Churyumov-Gerasimenko, and the animated images of the ‘rubber-duck’ shaped object taken by the Navigation Camera (NavCam), part of Rosetta’s Attitude and Orbital Control System. The arrival generated many headlines, from the ten years it took to catch the comet, through the history-making first rendezvous and comet orbit, to the final part of the mission and the intention to land on the comet. However, there was little detail about the remote sensing aspects of the mission, which we feel is a missed opportunity, as it uses many of the techniques and methodologies employed in Earth observation (EO).

The orbiter part of Rosetta carries eleven different remote sensing experiments, with a wide variety of sensors gathering data about the comet before the lander touches down. Amongst the instruments on board are three separate spectrometers: a visible and infrared thermal imaging spectrometer (VIRTIS) focussing on temperature and geography; an ultraviolet imaging spectrometer (ALICE) looking at gases and the production of water and carbon dioxide/monoxide; and finally ROSINA, which has sensors for measuring the composition of the comet’s atmosphere and ionosphere.

The VIRTIS instrument has two channels: the VIRTIS-H channel is a high spectral resolution mapper operating from 2 to 5µm, whereas VIRTIS-M is a mapper operating at a coarser spectral resolution, and one of its main products will be a global spectral map of the comet’s nucleus. This instrument has already been used to take measurements of Earth: in November 2009, on Rosetta’s third Earth fly-by, VIRTIS measurements were compared to those of existing EO instruments (ENVISAT’s AATSR and SCIAMACHY, and MODIS). Overall, there was a strong correlation with the EO data, but differences were also seen – especially in the 1.4µm water absorption feature.

VIRTIS has a key role in supporting the selection of the November landing site, a task that has become more difficult now that the comet has been imaged in detail and is seen to have a complex shape. In addition, recent VIRTIS measurements have shown the comet’s average surface temperature to be around minus seventy degrees Celsius, which means the comet is likely to be too warm to be ice-covered and instead must have a dark, dusty crust.

Remote sensing is playing a huge part in the Rosetta mission, and it should be celebrated that these instruments will gather data over the next eighteen months to help scientists determine the role comets play in the evolution of planets. It will be amazing if the remote sensing techniques developed to explore, monitor and analyse our own planet are the same techniques that help determine whether the water on Earth originally came from comets.

America’s Roadmap for Earth Observations

Have you all been keeping up with your reading of policy documents issued by the Executive Office of the President of the United States? If not, you may have missed the National Plan for Civil Earth Observations, issued a couple of weeks ago. Given that the US Government is the largest provider of Earth observation (EO) data in the world, this is important for everyone working in the field, particularly as the plan estimates that EO activities are worth $30 billion to the US economy.

The National Plan builds on the US National Strategy for Civil Earth Observations issued in 2013; such national Earth observation strategies aren’t unusual, and the UK has issued two in recent years: the UK Space Agency Earth Observation Strategy in October 2013 and the Department of Energy and Climate Change Earth Observation Strategy in June 2012. However, what makes the National Plan more interesting, and valuable, is that it ranks US priorities for civil EO together with the actions the US Government intends to take to deliver them.

Landsat 8 showing London, data courtesy of the USGS

The plan identifies five priorities, with the top two focussing on achieving continuity of long-term sustained EO. The number one priority is to maintain observations considered vital to public safety, national economic and security interests, and critical scientific research; this includes the continuity of Landsat multispectral imagery, the GPS network and a variety of weather, land and ocean measurements. The second priority is observations focussing on changes in climate, greenhouse gases, biodiversity and ecosystems, often in collaboration with international partners. The third priority covers short-term experimental observations of less than seven years’ duration, such as measurements for specific scientific research, first-of-their-kind observations, innovations and proof-of-concept work. The final two priorities are around improvements to service-life extensions, and the assessment and prioritisation of EO systems.

Whilst the priorities are interesting, far more interesting, and valuable, are the eight actions the US Government intends to take to deliver these priorities:

  1. Increase the integration of EO data, making data available to everyone irrespective of its original purpose. Eliminating the silo approach to data offers greater potential for innovative research.
  2. Implement the Big Earth Data Initiative (BEDI) to provide uniform methodologies and practices for the handling of EO data to enable a wider group of users, without specialist knowledge, to find, obtain, evaluate, understand, compare and use new and legacy data.
  3. Increase efficiency and cost savings through streamlining processes, the coordinated acquisition of data, and cooperation and collaborative working with commercial and non-US owned satellites.
  4. Improve spatial resolution, temporal cycle, sample density and geographic coverage of observation networks with both new observation systems and technical upgrades.
  5. Maintain the physical, computing, communication and human infrastructure required to deliver EO.
  6. Encourage private companies to invest in the space sector. However, it makes clear that it intends to maintain the principles of open data sharing which will make it interesting to see how, and where, private firms will get returns on their investments.
  7. Continue working with international bodies and space agencies to provide access to greater EO data and to support collaborative research.
  8. Use citizen science, crowdsourcing and private sector initiatives to leverage EO data innovations.

The National Plan is a detailed document, and it will be interesting to see the UK Space Agency, or perhaps European Space Agency, version. Any EO business working in, or with firms in, the US needs to begin planning for these developments. What does your business need to do to reposition its core competences, skill base or infrastructure to exploit these opportunities? Even if you don’t currently work in the US, take note: the journey outlined will impact the whole EO community.

Looking Deeper At Phytoplankton from Space

NASA is currently in the middle of a joint airborne and seaborne campaign to study the ocean and atmosphere in preparation for developing instruments for future spaceborne missions. The Ship-Aircraft Bio-Optical Research (SABOR) campaign has brought together experts from a variety of disciplines to focus on the issue of the polarization of light in the ocean; it runs from 17th July to 7th August and will co-ordinate ocean measurements with overflights.

One of the instruments flying in SABOR is an airborne lidar-polarimeter aimed at overcoming the limitation of the vertically integrated surface measurements captured by many existing Earth observation satellites. These traditional satellites measure the water-leaving radiance, which is the signal returned from an area of water; the problem is that this signal comes from a variety of different depths and is then aggregated to provide a single vertically integrated measurement for that area.

Diffuse attenuation depth at 490 nm, Kd(490), created from the SeaWiFS mission climatological data; data products retrieved from http://oceancolor.gsfc.nasa.gov/

In effect, this means that a phytoplankton bloom at the surface will show up as a strong concentration on an image, whereas the same bloom at greater depth will show as having a lower concentration. The figure on the right shows the diffuse attenuation depth at 490 nm (blue light), created from SeaWiFS mission climatological data collected between 1997 and 2010; the higher the value, the shallower the depth of maximum passive light penetration. In summary, light penetrates further in the open ocean than in many coastal waters, which are more turbid.
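
To make the link between Kd(490) and penetration depth concrete, here is a minimal sketch assuming simple exponential (Beer-Lambert) attenuation; the Kd values are assumed, illustrative figures for a clear open ocean and a turbid coastal water, not retrieved products.

```python
import math

def light_fraction(kd_per_m, depth_m):
    """Fraction of surface light remaining at a given depth, assuming
    simple exponential (Beer-Lambert) attenuation."""
    return math.exp(-kd_per_m * depth_m)

# Assumed, illustrative Kd(490) values: ~0.02 per metre for clear open
# ocean water and ~0.2 per metre for turbid coastal water.
for kd in (0.02, 0.2):
    print(f"Kd={kd}/m: 1/e depth = {1 / kd:.0f} m, "
          f"light remaining at 10 m = {light_fraction(kd, 10):.0%}")
```

With these assumed numbers, the 1/e penetration depth works out to 50 m for the open ocean case and only 5 m for the coastal case, consistent with the clear-water versus turbid-water contrast described above.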

The SABOR lidar will provide depth-resolved profiles, so instead of a single value for an area of water, the measurements will be separable by depth, with penetration expected to reach around 50m. This will enable a much more detailed analysis of what’s happening within the water column. Satellite lidar measurements have already provided initial insights into the scattering of light by phytoplankton through the CALIPSO satellite, an atmosphere-focused lidar mission launched in 2006.

In addition, the polarimeter element of SABOR will improve the quantification of in-water constituents, such as the concentration of chlorophyll-a (the primary pigment in most phytoplankton, as well as land-based plants), plus the understanding of marine aerosols and clouds. Polarimeters have been flown before, the POLDER/PARASOL missions being examples.

The SABOR campaign will provide valuable information to support a proposal to have an Ocean Profiling Atmospheric Lidar (OPAL) deployed from the International Space Station (ISS) in 2015. If successful, it will join the existing Earth Observation mission on the ISS, called the Hyperspectral Imager for the Coastal Ocean (HICO), which I discussed in an earlier blog.

The potential offered by depth profiled oceanic measurements is exciting and will offer much more granularity beyond the ocean’s surface. I’m looking forward to the campaign’s results.

Random Numbers from Space

The concept of randomness, and the creation of random numbers, has been part of human culture for thousands of years. In fifth-century Athens, elections were considered undemocratic: everyone was considered equal, so people were selected at random from the population to serve in government. Perhaps our current politicians should take note, although the principle still survives in the UK through jury duty selection.

Random numbers are integral to modern society, from the obvious betting and gambling arenas, to sport, science, the arts and cryptography – all those little devices used to log into bank accounts are based on random numbers; in addition, they’re key to satellite communication systems.

Computerised random number generators have been around for as long as programmers have programmed, and their algorithms produce series of numbers that look random but in fact aren’t, as they follow a predetermined sequence. These are known as pseudo-random numbers and are fine for many uses, but aren’t suitable for applications like secure communications or cryptography; for these we need true random numbers.
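
A quick way to see why pseudo-random numbers are unsuitable for cryptography is that the whole sequence is determined by the seed. The sketch below uses Python's standard library to show two identically seeded generators producing identical output.

```python
import random

# Two generators given the same seed produce exactly the same "random" sequence.
a = random.Random(42)
b = random.Random(42)
seq_a = [a.randint(0, 9) for _ in range(10)]
seq_b = [b.randint(0, 9) for _ in range(10)]
print(seq_a == seq_b)  # True: the output is fully determined by the seed
```

For security-sensitive work, Python's `secrets` module draws on the operating system's entropy pool rather than a seeded algorithm.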

Lightning, Copyright: Taiga / 123RF Stock Photo

A true random number is one whose outcome is unpredictable, like the roll of a die. Whilst this works for a single true random number, what if you want thousands or millions? Building a machine to throw millions of dice simultaneously isn’t sensible; instead, true random numbers are created by applying a physical property of the environment through a computer, for example decays in radioactive sources, snapshots of lava lamps or the atmospheric noise caused by lightning strikes within thunderstorms. Last Thursday night would have been a goldmine for anyone using this methodology, as over 3,000 lightning strikes hit the country within three hours.

The space sector is now becoming involved in this area. In last week’s blog we reported on the two UK satellites recently launched; one of them, UKube-1, built by Clyde Space in Glasgow, carries a true random number generator. The JANUS experiment will test the feasibility of using cosmic radiation to create true random numbers, by detecting the impacts of space particles through the single event upset effect.

This could offer an alternative method of creating high volumes of random numbers, particularly for the communication and cryptography industries, and gives one more example of how space can help.

Blog written in conjunction with Adam Mrozek, work placement student.

How Many Earth Observation Satellites are in Space?

Space is a growing market! With Google recently announcing its purchase of Skybox Imaging, the myriad of organisations jostling to be the first to offer commercial space flights, and the launch of two UK satellites last week (TechDemoSat-1 and UKube-1), it’s clear that space is becoming an increasingly congested marketplace.

Artist's rendition of a satellite - paulfleet/123RF Stock Photo

Have you ever wondered about the Earth observation (EO) market? Who owns and controls the EO satellites you use? I’m sure you know the big names, such as the US Government controlling Landsat, ESA’s recent launch of Sentinel-1A, and so on, but what about the rest? In a recent blog, we used data from the Union of Concerned Scientists (UCS) and the United Nations Office for Outer Space Affairs (UNOOSA) to calculate that there are currently 3,921 satellites orbiting the Earth, of which 1,167 are active. Today we’re focusing on the EO fleet; for EO, we’re going to count any satellite whose purpose is defined as EO, remote sensing, Earth science or meteorology – acknowledging that some satellites have more than one purpose.

According to the UCS database, at the end of January 2014 there were 192 EO satellites in orbit, the oldest of which is a Brazilian meteorology/EO satellite, SCD-1, launched in 1993. Forty-five nations and organisations have EO satellites in space, and in terms of numerical supremacy it’s a neck-and-neck race between China and the USA: China controls 25.5% of the fleet compared to the USA’s 23.5% – although just over a third of the USA’s fleet was jointly launched with other countries. After the front-runners, India has 7.29%, followed by Germany with 4.69% and Russia with 3.65%.
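
For illustration, the country shares quoted above can be back-calculated from approximate satellite counts. The counts below are our assumed round numbers inferred from the percentages of a 192-satellite fleet, not figures taken from the UCS database itself.

```python
# Assumed counts, inferred from the quoted percentages of the 192-strong fleet.
fleet = {"China": 49, "USA": 45, "India": 14, "Germany": 9, "Russia": 7}
TOTAL = 192

for country, count in fleet.items():
    print(f"{country}: {count}/{TOTAL} = {100 * count / TOTAL:.2f}%")
```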

The picture of control becomes more interesting when you look at the four user groups for this EO fleet:

  • 56.77% are listed as used by governments;
  • 25.63% are listed as military satellites;
  • 6.25% are commercial satellites;
  • 4.17% are listed as being for civil uses; and
  • the remaining 7.18% are listed as being shared between two of the four user groups.

However, the space landscape is changing rapidly. Since the UCS database was last updated, over 130 satellites have been launched, dominated by CubeSats. The lower cost of CubeSats has removed a significant barrier to entry for new players in space; we’ll see more commercial organisations, like Google, becoming interested in space, and countries which traditionally haven’t had a presence in space getting a foothold. In addition, governments will be looking to launch satellites to build up their own space industries, something the UK has been focussing on for the last couple of years.

This changing environment will affect everyone working in the EO industry, particularly those in downstream activities, as there will be an increased number of datasets. Downstream companies will need to secure access to the new data to ensure they stay ahead of their competitors, and in a more commercial marketplace this will almost certainly involve a cost. Strategic partnerships are going to become increasingly important in the EO world; don’t get left behind – start horizon scanning now and see where you need to position your company.

Remote Sensing Big Data: Possibilities and dangers

Remote sensing is an industry riding the crest of the big data wave. It offers great opportunities to those that can harness the power, but it’s also fraught with dangers. Big data is a blanket term used to describe datasets that are large and complex, due to the quantity of data, the speed at which new data becomes available or the variety of data. Remote sensing ticks all three of these boxes!

Sentinel-1 image of the coast of the Netherlands; courtesy of ESA

When I first started working in remote sensing, I approached the IT department to ask for 100 megabytes of disk space for my undergraduate project and was told nobody would ever need that much storage! Currently, the amount of Earth observation data available to the community is growing exponentially. To give some examples: the recently launched Copernicus Sentinel-1A satellite collects around 1.7 terabytes of data daily; the number of daily images collected by Landsat 8 was increased by 18% this month; and DigitalGlobe estimates it captures two petabytes of data each year. This quantity of data poses two key challenges: firstly, where do you store it? Secondly, how do you know which data will be valuable in enhancing your decision-making?
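
To put the daily figure in context, a quick back-of-the-envelope calculation (decimal units, and ignoring any growth in acquisition rates) scales Sentinel-1A's quoted rate up to a year:

```python
# Scale the quoted ~1.7 TB/day from Sentinel-1A to an annual figure.
daily_tb = 1.7
annual_tb = daily_tb * 365
print(f"~{annual_tb:.1f} TB/year, ~{annual_tb / 1000:.2f} PB/year")
```

So a single radar mission already approaches a petabyte a year, before any downstream products are counted.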

It’s assumed the storage issue has been solved by cloud computing, but there is a cost to getting the data to, and from, the cloud. An interesting recent study by the University of British Columbia found that over 80% of scientific data is lost within 20 years, mostly due to obsolete storage devices and defunct email addresses. I have first-hand experience of this: my PhD data was stored on hundreds of floppy disks, and when I came to use them recently most didn’t work; fortunately I have a zip drive backup – although I still need to work out how to read Quattro Pro spreadsheets! I also have several Sun workstations with associated data on tapes which will only read on the machines they were written on, so how much of this data is accessible is debatable.

How often do you think about your old and archived data? Take a moment to consider how, and where, your critical data is stored. Is all of your data available and accessible? When was the last time the back-up procedures for your scientific or business data were tested? Does your IT department know which email addresses are critical for the receipt of satellite data?

The second challenge is knowing what data to use, particularly for people new to remote sensing. There is free data, paid for data, various satellites, various data types, various formats and the list can go on. The remote sensing community needs to help by providing more bridges between the data and the user community. The datasets available can offer huge benefits for business and science, but if people have to spend hours hunting round and trying to find the right image for them, they won’t stay users for long.

You can hire remote sensing companies, like us, to offer impartial advice to help you select the right information. Pixalytics is striving to find ways to make data more available, more accessible and more understandable. Remote sensing data belongs to everyone, and we need to support users in getting it.

Is the Dead Sea in terminal decline?

The Dead Sea is located in the Jordan rift valley, with Jordan to the east and Israel and Palestine to the west; its shoreline has the lowest land elevation on Earth, at over 400 metres below sea level.

Technically, the Dead Sea isn’t a sea at all; it’s a hypersaline lake. Freshwater flows in from the Jordan River and its associated tributaries into the land-locked Dead Sea, where some of the water evaporates. This increases the water’s salinity (it’s about nine and a half times saltier than the ocean) and famously means people can happily float on it through natural buoyancy alone.

Despite having a surface area of almost four hundred square miles, the Dead Sea is estimated to be shrinking, with water levels dropping at around one metre per year. Levels have declined by approximately 30 metres since 1960, due to a combination of increased usage and extraction upstream in the River Jordan, and a reduction in rainfall.

Dead Sea imagery from Landsat 5 in 1984 on the left, and Landsat 8 in 2014 on the right. Data courtesy of USGS.

The changing shape of the Dead Sea can be seen in the Landsat images shown on the right, which were both taken in the May/June period but 30 years apart, the first in 1984 and the second in 2014. They show a reduction in surface area, alongside the development of evaporation ponds in the south that are maintained by pumping water from the northern basin – the Al-Lisān peninsula splits the sea into two unequal basins, with the northern basin being significantly larger and deeper. We also wondered whether the 2014 image shows spontaneous crystallization in the surface waters, as described by Steinhorn in 1983. It could be worth further investigation.

A talk at the recent EARSeL conference also highlighted the problem of sinkholes around the Dead Sea. These are caused by the interaction of freshwater with subterranean salt layers: as the water level drops, salt is left behind in the soil, and when freshwater washes through, the salts dissolve and cavities are created. Eventually the subterranean structure loses integrity and sinkholes form. It’s estimated that there are around 3,000 sinkholes in the Dead Sea region and, more worryingly, that a new one opens up almost every single day!

The Dead Sea is one of the natural wonders of the world, and yet it’s slowly dying through shrinkage and sinkholes. Remote sensing is a great tool for monitoring natural resources, but something needs to be done on the ground if these images aren’t to become a modern-day death mask.

34th EARSeL Symposium

Last week I attended the 34th Symposium of the European Association of Remote Sensing Laboratories (EARSeL) in Warsaw, Poland. Originally formed in 1977, EARSeL is a scientific network of academic and commercial remote sensing organisations. Its aims include:

  • promoting education and training related to remote sensing and specifically Earth Observation (EO),
  • undertaking joint research projects on the use, and application, of remote sensing,
  • providing governmental and non-governmental organisations with a network of remote sensing experts.

EARSeL Bureau Handover, Warsaw 2014

EARSeL is run by a Council of elected national representatives and an executive Bureau elected by the Council. For the last year I have been proud to serve on the EARSeL executive Bureau as Treasurer. My term of office finished at the symposium, and I’d like to wish the new Bureau a successful year.

In addition, I was co-chair and a presenter for the Oceans & Coastal Zones session on the Monday afternoon, and on the Wednesday I taught a session on ‘Introduction to optical data processing with BEAM’ as part of the joint EARSeL & ISPRS (International Society for Photogrammetry and Remote Sensing) Young Scientist Days, which ran alongside the symposium.

For me the promotion of science generally, and specifically Earth Observation (EO), is an integral part of running Pixalytics. I want to support more people to understand and get involved; in particular, it’s vital that we educate and inspire the early career, and next generation, scientists.

It’s for these reasons that I enjoy working with, and being part of, organisations that work to inform, educate and promote similar scientific aims. As well as being EARSeL Treasurer, I was Chair of the UK’s Remote Sensing and Photogrammetry Society (RSPSoc) for three years, and I’m currently vice-chairman of the British Association of Remote Sensing Companies (BARSC).

It can be challenging to balance the income earning side of Pixalytics with the volunteering side, but it’s worth it. There is a real case for businesses getting their employees to volunteer to support work outside of the company, whether it’s industry promotion, teaching or helping support social issues in the local community. Aside from the obvious support for the cause they are volunteering for, it can also help develop skills in time management, decision-making and leadership.

I’ve learnt a huge amount working with these different organisations; as well as developing skills, I’ve met people outside my specialism and strengthened my business network. I have no intention of stopping volunteering, and I’ve always got one eye out for new opportunities. Volunteering can add value to your company, however large or small, and I’d recommend all organisations consider the opportunities it could provide for them and their employees.