PyTrx published in Frontiers in Earth Science

Frontiers in Earth Science recently published our work on PyTrx, the Python toolset developed during my PhD for processing oblique time-lapse imagery of glacial environments. The toolset is freely available via pip install and GitHub, and this paper serves as its companion piece to inform users of its capabilities and applications.

Ice velocities from Kronebreen

Figure 5 from our Frontiers publication, demonstrating PyTrx’s dense feature-tracking and georectification capabilities. Velocities were determined from oblique time-lapse image pairs captured between 14 June and 7 July 2014 at Kronebreen, a tidewater glacier in Svalbard. Templates (represented here as points) were defined over a 50×50 m grid and matched between image pairs using normalised cross-correlation, then filtered by correlation (i.e. templates were retained where the match correlation was above 0.8). The sequence shows an early-season speed-up at the terminus of the glacier, where velocities increase from an average of 2.5 m/day (14-16 June, first panel) to 4.7 m/day (5-7 July, last panel).

PyTrx came about because I wanted to derive measurements from time-lapse imagery I had collected at Kronebreen and Tunabreen, two tidewater glaciers in Svalbard, but I couldn’t find any openly available toolset that met my needs. There are a handful of toolsets for processing ice velocities from time-lapse imagery (see ImGRAFT, Pointcatcher and EMT – three great examples), but I also wanted to process other types of measurements, such as meltwater plume footprint extents, supraglacial lake areas, and changes in terminus position. Additionally, most toolsets that I came across were programmed in a limited range of programming languages, mainly Matlab, and I felt there was a need for a toolset in an open-source programming language for those who wanted an alternative that did not rely upon licensed software.

We set about making PyTrx just for our own processing needs at first, programming it in Python and largely utilising OpenCV, a package that handles complex computer vision operations on optical imagery. Before long, we realised there was growing interest from others and a need for this toolset in the public domain, and so we began focusing on finalising PyTrx as an operational package that anyone could use.

Delineating meltwater plume footprints using PyTrx

Figure 8 from our Frontiers publication, showing changes in meltwater plume extent distinguished from time-lapse imagery of Kronebreen using PyTrx. The surface expression of the meltwater plume has been tracked through images captured on 5 July 2014 at 18:00 (A), 20:00 (B), and 22:00 (C) to demonstrate part of its diurnal recession. Each plot shows the plume definition in the image plane (top) and its translation to real-world coordinates, plotted onto a coinciding Landsat 8 satellite image (bottom).

PyTrx has been developed with an object-oriented design, meaning that the core functions are wrapped in callable objects, which makes the toolset accessible to beginners in programming whilst also serving the more experienced. The following main functions can be performed with the toolset: dense template matching and sparse Optical Flow methods for deriving velocities; automated detection of area features (such as supraglacial lakes); manual delineation of area and line features (e.g. meltwater plume footprints, terminus position); and camera calibration and optimisation for refining the georectification of measurements from the images.

There were many stumbling blocks when it came to publishing PyTrx. I struggled because acting on the feedback, although positive, was a large undertaking; it made me question PyTrx’s worth, and I doubted my own capabilities in delivering a sound toolset to the glaciology community. Overall though, the review process brought about big changes to PyTrx that were absolutely essential to improving and finalising the toolset. I have a lot to thank the review process for; without it, the toolset would not have reached its full potential and be what it is today.

Terminus lines derived with PyTrx

Figure 9 from our Frontiers publication, demonstrating PyTrx’s ability to extract terminus profiles at Tunabreen, a tidewater glacier in Svalbard. Terminus lines were manually traced from sequential time-lapse images, and subsequently georectified to provide a record of terminus retreat.


PyTrx publication in Frontiers – our paper, describing the toolset and its applications using time-lapse imagery of tidewater glaciers in Svalbard

PyTrx GitHub repository – where PyTrx can be downloaded from (the master branch holds the raw scripts, whilst the distribution branch holds the package files and readthedocs materials)

PyTrx readthedocs – PyTrx guide and code documentation

PyTrx on PyPI – PyTrx’s package distribution via pip

Making a PyPI package

Recently, I have had a paper accepted which presents PyTrx, a new Python toolset for use in glacial photogrammetry. Over the course of getting this published, co-authors and reviewers alike suggested using a package manager for easy download and implementation of PyTrx. I therefore wanted to package the toolset up for distribution via PyPI (‘pip’), thus making it easily accessible to other Python users with the simple command pip install pytrx. Whilst I found the tutorials online informative, there were some pitfalls that I found hard to solve with the given information. So here is an account of how I got my package onto PyPI. The associated files for the PyTrx package are available on a branch of PyTrx’s GitHub repository, if you want to see this walkthrough in action.

Defining the package files

First and foremost, the file structure of the toolset is crucial to it being packaged up correctly. The top directory should contain a folder containing your package, and several other files containing the necessary setup information:

   - master_folder
       - PyTrx
       - LICENSE.txt
       - README.md
       - setup.py

This is one of the first slip-ups I made, putting all my toolset scripts in the top directory rather than in a folder of their own. If the Python scripts that make up your package are not placed in their own folder, then they will not be found when it comes to compiling the package.

So let’s go through each of these elements, beginning with the folder that contains the Python scripts we wish to turn into a PyPI package. An initialisation file needs to be created in this folder in order to import the directory as a package. This is simply an empty Python script called __init__.py, so our folder structure will look a bit like this now:

   - master_folder
       - PyTrx
           - __init__.py
       - LICENSE.txt
       - README.md
       - setup.py

Moving on to the LICENSE.txt file: it is important to define a license with any Python package that is publicly distributed, in order to inform users how your package can be used. This can simply be a text file containing a copied license. A straightforward and popular license for distributing code is the MIT license, which allows code to be used and adapted with appropriate credit, but there are great guides online for choosing a license appropriate for you. This file has to be called ‘license’ or ‘licence’ (uppercase or lowercase) so that it is recognised when it comes to compiling the package.

Similarly with the README.md file, this has to be called ‘readme’ specifically so that it is recognised when it comes to compiling the package. This file contains a long description of the Python package. It might be the case that you already have a README.md file if you have hosted your scripts on GitHub, in which case you can merely adopt this as your readme. Just remember that the readme will form the main description of your package that people read when they navigate to the package’s PyPI webpage, so it needs to be in a format that PyPI can render (Markdown, in our case).

And finally, the setup.py file. The setup file is probably the trickiest file to define, but the most important, as here we outline all of the metadata associated with our Python package, including the package’s recognised pip name (i.e. the one used in the command pip install NAME), its version, author and contact details, keywords, short package description, and dependencies. Here is PyTrx’s setup.py file to serve as an example:

import setuptools

with open("README.md", "r") as fh:
    long_description = fh.read()

setuptools.setup(
    name="pytrx",
    version="1.1.0",
    author="Penelope How",
    description="An object-oriented toolset for calculating velocities, surface areas and distances from oblique imagery of glacial environments",
    long_description=long_description,
    long_description_content_type="text/markdown",
    keywords="glaciology photogrammetry time-lapse",
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Development Status :: 5 - Production/Stable",
        "Intended Audience :: Science/Research",
        "Natural Language :: English",
        "Operating System :: OS Independent",
    ],
    install_requires=['glob2', 'matplotlib', 'numpy', 'opencv-python>=3', 'pillow', 'scipy'],
    packages=setuptools.find_packages(),
)

Most of the variables are straightforward to adapt for your own package file. The ones to watch out for are the classifiers variable, where metadata flags are defined, and the install_requires variable, where the package’s dependencies are listed. PyPI offers a good resource that lists all of the possible classifiers you can add to the classifiers variable.

Finding out how to define dependencies was a little trickier though, as the main PyPI tutorial does not address this. This page gave a brief outline of how to define them with the install_requires variable, but I found that I still had problems in the subsequent steps with package incompatibilities. My main problem was that I had largely worked with conda rather than pip for managing my Python packages, so there were a number of discrepancies between the two in configuring dependencies with PyTrx. The main challenge was finding a balance with OpenCV and GDAL, two notoriously difficult packages to find compatible versions for. I had managed this with conda, finding two specific versions of these packages to configure a working environment; in pip, this proved much harder. The package versions used in conda were not the same for pip, and there wasn’t an official PyPI repository for OpenCV, only the unofficial opencv-python package. We’ll learn more about testing dependency set-ups a bit later on, but for now, just check that each package dependency is available on PyPI, and use >= or <= to define whether the package needs to be above or below a certain version. It is generally advised not to pin a dependency to a specific version (i.e. ==), as this reduces the flexibility of the package installation for users.
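As a sketch of the specifier styles mentioned above (the packages and version bounds here are illustrative, not PyTrx’s actual pins):

```python
# Illustrative install_requires entries showing version specifier styles
install_requires = [
    'numpy',               # any available version
    'opencv-python>=3',    # at least major version 3
    'scipy>=1.0,<2.0',     # bounded range, avoiding an exact == pin
]
```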

Generating the distribution files

Once we have all of our files, we can compile our package and generate the distribution files that will eventually be uploaded to TestPyPI and PyPI. It is advised to use TestPyPI to test your package distribution before doing the real deal on PyPI, and I found it incredibly useful as an apprehensive first-time uploader.

If you do decide to test your package on TestPyPI, it is good etiquette to change the name of your package (defined in setup.py) to something unique – there are many test packages on TestPyPI, and although they are deleted on a regular basis, there are plenty of package names that yours could clash with. In the case of PyTrx, I defined the package name as pytrxhow (the package name with my surname), so that there was no chance of using a name that had already been taken. Additionally, you should take your dependencies out of the setup.py file, as often the same packages do not exist on TestPyPI, so it is not an accurate reflection of how your package dependencies will behave on PyPI.

To generate the distribution files, two packages need to be installed into your Python environment: setuptools and wheel. I already had versions of these packages in my conda environment, but I updated them using the same command (in Anaconda Prompt) as if I were installing them afresh:

conda install setuptools wheel

After these are installed, navigate to the directory where all of your files are (i.e. in master_folder) using the cd command, and run the following command to build your distribution files for TestPyPI:

python3 setup.py sdist bdist_wheel

This should generate two new folders, dist and pytrx.egg-info, containing files that look something like this:

   - master_folder
       - PyTrx
       - LICENSE.txt
       - README.md
       - setup.py
       - dist
           - pytrx-1.1.0-py3-none-any.whl
           - pytrx-1.1.0.tar.gz
       - pytrx.egg-info
           - PKG-INFO
           - SOURCES.txt
           - dependency_links.txt
           - requires.txt
           - top_level.txt

The dist and egg-info folders should contain all of the information entered into the setup.py file, so it’s a good idea to check through these to see that the files are populated correctly. The SOURCES.txt file should contain a list of the paths to all of the files relevant to making your package. If you have taken out your dependencies, then the requires.txt file should be empty.

Testing the distribution

There are two ways to test that the distribution files work: 1. using TestPyPI to trial the distribution and the ‘look’ of the PyPI entry, and 2. using the setup.py file to test the package installation in your local environment (including dependency resolution). Beginning with the test on TestPyPI: start by creating an account on TestPyPI and creating an API token so that you can securely upload the package (there is a great set of instructions for doing this online). Make sure to write down all of the information associated with the token, as you will not be able to see it again.

Next, make sure that you have an up-to-date version of the twine package in your environment. Twine is a Python package primarily for uploading packages, which can easily be installed/upgraded in a conda environment with the following command:

conda install twine

Now, Twine can be used to facilitate the upload of your package to TestPyPI with this command (making sure that you are still in your master_folder directory):

python3 -m twine upload --repository-url https://test.pypi.org/legacy/ dist/*

Once the command has run, there will be a link to your TestPyPI repository at the bottom, which you can click on to take you to it. You can use this to test install your package with no dependencies. In the case of PyTrx (pytrxhow, my test version), this could be done with the following command (just change ‘pytrxhow’ to specify a different package):

pip install -i https://test.pypi.org/simple/ pytrxhow

This is all well and good for testing how a package looks on PyPI and that it can install; however, I was more anxious about the package dependencies, knowing the issues I had had with OpenCV and GDAL previously in my conda environment. After checking your TestPyPI installation (this may take a few tries, updating the version number each time), put your dependencies back into your setup.py file, run the distribution file generation again, and test the dependency configuration with the following command, which will attempt to install your package locally:

python setup.py develop

This may take some time to run, but should give you an idea as to whether the dependencies can be resolved. I cloned my base conda environment in order to do this, giving a (relatively) blank environment to run off, and tested the installation by attempting to import the newly installed package in Spyder.

I found that I could not solve the environment no matter what I specified in setup.py, and therefore had to play around to find which package was causing the majority of the problems. GDAL was the main cause of PyTrx failing to install, so I took it out of my dependencies, instead opting to install it afterwards with conda. This seems to work much better, and although it may not be a perfect solution, it will create fewer problems for users.

Uploading the distribution to PyPI

So at this point you should feel confident in the look and feel of your package, and its installation in your environment. Before proceeding with the final steps, just run through the following checklist to make sure you have everything:

  • Check that all the information is correct in setup.py, changing the name (e.g. ‘pytrxhow’ to ‘pytrx’) and dependencies if you have been uploading to TestPyPI previously
  • If you change anything in the setup.py file, then run the distribution file generation again
  • Check your TestPyPI page to make sure all the information uploaded is correct and nothing is missing
  • Check on PyPI that there is no other package with the same name as yours

A thorough check is needed at this stage because an upload to PyPI cannot be changed. Further package versions can be uploaded if there is a major problem, but versions that you have uploaded cannot be edited or altered. Therefore it is best to try and get it right the first time. No pressure!

For uploading to PyPI, you need to create an account on PyPI. This account creation is separate from TestPyPI, so you will need another username and password, unfortunately. Again, create an API token in the same manner as previously with TestPyPI, making sure to write down all of the details associated with it. To upload your package to PyPI, we use Twine again, with the following command:

twine upload dist/*

Once run, there will be a link to click through to your PyPI page and voila, your package is online and easy for anyone to download with the old classic command (in the case of PyTrx):

pip install pytrx

In the case of PyTrx, our PyPI page is available to view here, and our GitHub repository contains all of PyTrx’s scripts and the distribution files used for the PyPI upload, which might be useful for some. Hope this helps someone who has suffered any of the pitfalls of PyPI packages! 

Icebergs in Nuuk

Useful resources:

A broad overview and use of Test PyPI

Uploading to PyPI

More information about specifying dependencies and testing package installations

More information about PyPI classifiers

PyTrx’s PyPI page, GitHub repository, and publication

Moving to Greenland

It’s been a while since I used this platform to post an update. A lot has changed in the past year. For one, I now live in Nuuk, the capital city of Greenland. Greenland (or Kalaallit Nunaat in Greenlandic) is a self-governing region within the Kingdom of Denmark, having been granted self-government in 2009. It has the lowest population density in the world*, with just ~17,000 people living in the capital.

Nuuk is a weird and wonderful place to live. I have already built a strong repertoire of funny anecdotes from trying to navigate my first month of living here – from being asked ‘do you have any meat or fish?’ before getting into a taxi, to watching icebergs float past my house in the fjord. It was a big step to make this move, and so far it’s been absolutely worth it.

I moved here to take up a permanent position at Asiaq Greenland Survey as a remote sensing specialist. It’s a small shift from academia, but we still conduct research and write scientific papers just like everyone else. The pace of work is fast here and I am still settling in and figuring everything out, but it is the refreshing change I needed after finishing my PhD. We are currently working on a handful of ESA projects which I have fallen into nicely, and writing proposals for future projects.

Overall, I hope I can kickstart writing these updates again. They were fun to do during my PhD, and it was a pity I lost that in the lead up to handing in my thesis and during my first postdoc. I hope I can report on projects I work on in Asiaq, and talk about life in Greenland generally. Here’s to this platform’s rejuvenation!

Out and about around Nuuk

This was from a hiking/fishing trip just outside of Nuuk. Fishing here is very easy if you like cod. They say here that if you don’t get any fish after two or three casts then you should move on to the next spot! The fish we caught this day were massive!

*Statistics Greenland have unbelievably thorough analytics on Greenland (e.g. their 2019 report)

Calving dynamics at Tunabreen, Svalbard, published in Annals of Glaciology

Annals of Glaciology has recently published our work examining calving dynamics at a tidewater glacier in Svalbard (click here to see article). In the study, we use time-lapse images captured every 3 seconds to document and analyse calving events at Tunabreen that occurred over a 30-hour period in August 2015.

Our time-lapse camera installed at Ultunafjella in August 2015, overlooking the calving front of Tunabreen, a tidewater glacier in Svalbard. We captured images every 3 seconds over a 30-hour period, from which we could distinguish calving events in high levels of detail.

In total, we acquired 34,117 images. Compiled together, these images produced a sequence from which we distinguished 358 individual calving events. We could also discern the style of each calving event, allowing us to infer the controls on calving at this particular glacier front.

Calving at Tunabreen was characterised by frequent events during our monitoring period, with 12.8 events occurring every hour on average. Most calving events were small in magnitude, relative to those observed at other tidewater outlets such as those in Greenland (e.g. James et al., 2014), and other tidewater glaciers in Svalbard (e.g. How, 2018).

Fig 3 from How et al. (2019)

All documented calving events and styles observed at Tunabreen, first distinguished in the image plane (left) and subsequently georectified to extract real coordinates and compare to ice velocity (right) (Figure 3 from How et al., 2019)

Five calving styles were observed – waterline events, ice-fall events, stack topples, sheet topples, and subaqueous events – based on the relative size and mechanism of failure. A high majority of calving events (97%) originated from the subaerial section of the ice cliff, despite the fact that 60–70% of the terminus is below sea level. Subaqueous calving events were very rare, with only 10 observed over our monitoring period. The rarity of subaqueous events indicates that ice loss below the waterline is dominated by submarine melting, with only localised development of projecting ‘ice feet’.

Over two-thirds of observed calving events occurred on the falling limb of the tide, suggesting that tidal level plays a key role in the frequency of calving events. Calving events were also roughly twice as frequent in the vicinity of meltwater plumes compared with non-plume areas, indicating that turbulent water promotes terminus instability. The presence of a ~5 m undercut at the base of the glacier further supports the idea that ice is being excavated from below the waterline.

An example of a subaqueous calving event at Tunabreen, occurring in the plume area and captured using our time-lapse camera. The section of ice front shown is approximately 350 m wide, and the iceberg is 40 m wide.

We conclude that, based on these observations, calving rates at Tunabreen during this observation period may simply be paced by the rate of submarine melting. Similar dynamics have been observed at other tidewater glaciers in Svalbard (e.g. Chapuis and Tetzlaff, 2014; Pȩtlicki and others, 2015), Greenland (e.g. Medrzycka et al., 2016) and Alaska (e.g. Bartholomaus et al., 2015). This being the case, the inference of calving rate from submarine melt rate would greatly simplify the challenge of incorporating the effect of melt-undercutting into predictive numerical models, at least for this type of well-grounded, highly fractured glacier.

To read more about this research, please check out our paper published in Annals of Glaciology.

A dummies guide to… my PhD thesis on glacier dynamics

Recently my PhD thesis was made available online through the Edinburgh Research Archive, titled ‘Dynamical change at tidewater glaciers examined using time-lapse photogrammetry’. The thesis is 342 pages long which would be a marathon to get through for anyone, so here is a short synthesis.

In a nutshell

Title: Dynamical change at tidewater glaciers examined using time-lapse photogrammetry

Goal: To understand processes linked to dynamical change at tidewater glaciers.

Three main aims:
1. Examine subglacial hydrology and its influence on glacier dynamics at Kronebreen, a fast-flowing, tidewater glacier in Svalbard
2. Investigate controls on terminus conditions and calving processes at Tunabreen, a surge-type, tidewater glacier in Svalbard
3. Develop a suite of photogrammetry tools for obtaining measurements from oblique time-lapse imagery

Techniques used: Monoscopic time-lapse photogrammetry, hot water borehole drilling, bathymetry surveying, satellite feature-tracking, passive seismic monitoring, melt/runoff modelling

Acquiring data from the field

Me at camera site 8b, Kronebreen, Svalbard (May 2015)

An example of one of our time-lapse cameras, installed at Kronebreen

High-detail monitoring of glacier termini is challenging. We employed time-lapse photogrammetry as our primary technique because it offers high-resolution data acquisition (e.g. one image every 3 seconds over 24 hours) as well as acquisition rates suited to longer-term monitoring (e.g. one image every hour over the course of a melt season). We could therefore acquire imagery at different temporal frequencies depending on which aspects of the glacier system we wanted to examine.

Between 2014 and 2017, we deployed 7 – 14 time-lapse cameras at two glaciers in Svalbard (Kronebreen and Tunabreen) to monitor various aspects of the glacial system – ice flow, terminus retreat, supraglacial lake drainage, meltwater plumes, and local fjord circulation. We combined the findings from these images with other datasets (e.g. borehole measurements, bathymetry surveys) in order to examine dynamical change at a high level of detail.

Finding #1: Spatial variation in Kronebreen’s ice flow is primarily controlled by meltwater routing at the glacier bed

From our time-lapse images over the 2014 melt season, along with borehole data analysis, melt/runoff modelling and hydropotential modelling, we found that spatial variations in ice flow at Kronebreen were primarily controlled by the location of subglacial meltwater channels.

Efficiency in subglacial water evacuation varied between the north and south regions of the glacier tongue, with the north channel configuration draining a large proportion of the glacier catchment through persistent channels, as indicated by hydropotential modelling. Channel configurations beneath the south region of the terminus were vastly different, with rapid hydrological changes evident and cyclic ‘pulsing’ suggested from the observed meltwater plume activity.
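Hydropotential modelling of this kind is commonly based on the Shreve hydraulic potential, assuming subglacial water pressure at (or near) ice overburden. A minimal sketch of that calculation, with made-up elevations rather than real Kronebreen DEMs:

```python
import numpy as np

# Densities (kg m-3) and gravity (m s-2)
RHO_W, RHO_I, G = 1000.0, 917.0, 9.81

def hydraulic_potential(z_surface, z_bed, flotation=1.0):
    """Shreve hydraulic potential (Pa); water is routed down its gradient."""
    ice_thickness = z_surface - z_bed
    return RHO_W * G * z_bed + flotation * RHO_I * G * ice_thickness

# Toy along-flow profile: surface lowering from 700 m to 100 m over a flat bed
z_surf = np.linspace(700.0, 100.0, 50)
z_bed = np.zeros(50)
phi = hydraulic_potential(z_surf, z_bed)
# phi decreases towards the terminus, so modelled water drains that way
```

On real DEMs, mapping where the downslope gradients of this potential converge gives the predicted channel locations.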

These differences in subglacial hydrology are reflected in ice flow, with faster velocities in the south region of the glacier, facilitated by enhanced basal lubrication and sliding. Two speed-up events were observed during the 2014 melt season, the second being of particular importance given that it occurred at the end of the melt season and enabled fast flow through the winter season. It is suggested that this event was caused by an abnormally high rainfall event which overwhelmed an inefficient hydrological regime entering its winter phase. This phenomenon highlights that the timing of rainfall events at tidewater glaciers is fundamental to their impact on ice flow.

The Cryosphere Kronebreen maps

Sequential velocity maps (left) and velocity change maps (right) of Kronebreen showing the first of two speed-up events experienced during the 2014 melt season.

Finding #2: Terminus stability is inherently linked to both atmospheric and oceanic variability at Tunabreen. In particular, calving activity is primarily facilitated by melt-undercutting

Terminus conditions at Tunabreen were examined on two differing temporal scales:

  1. Over a one month period in peak melt season using time-lapse images acquired every 10 minutes
  2. Over a 28-hour period in August 2015 using time-lapse images acquired every 3 seconds

Over the one-month observation period, the terminus retreated 73.3 metres, with an average retreat rate of 1.83 metres per day. The frontal ablation rate fluctuated between 0 and 8.85 metres per day, and 1820 calving events were recorded of which 115 events were simultaneously detected from passive seismic signatures recorded in Longyearbyen. Overall, strong links were found between terminus position changes and both sea surface temperature and air temperature, suggesting that atmospheric forcing plays a larger role in terminus stability than previously considered.

Calving events at Tunabreen over a 28-hour period in August 2015, captured using high-resolution time-lapse photography (one photo every three seconds). Calving events are categorised as subaerial (i.e. ice falling from the front above the waterline), subaqueous (i.e. ice breaking off from the front beneath the waterline), both (i.e. large calving events containing ice of both subaerial and subaqueous origin), and unknown (where categorisation was hindered by concealment or poor visibility).

Calving activity at Tunabreen consists of frequent events, with 358 calving events detected from the 28-hour, high-frequency time-lapse sequence (i.e. 12.8 events per hour). The majority of these calving events (97%) occurred above the waterline, despite the fact that 60-70% of the terminus is subaqueous (i.e. below the waterline). This suggests that ice loss below the waterline is dominated by submarine melting rather than the break-off of large projecting ‘ice feet’. In addition, calving events were twice as frequent in the vicinity of meltwater plumes, with visible undercutting (approximately 5 metres) revealed by the bathymetry side profiles. Overall, this suggests that enhanced submarine melting causes localised terminus instability at Tunabreen.

Finding #3: PyTrx is a viable Python-alternative toolbox for extracting measurements from oblique imagery of glacier environments

PyTrx velocities

An example of PyTrx’s capabilities in deriving surface velocities at Kronebreen, Svalbard. Velocities are calculated from the image using a sparse feature-tracking approach, with unique corner features identified using Shi-Tomasi corner detection and subsequently tracked using Optical Flow approximation. In this example, 50,000 points have been successfully tracked between an image pair from Kronebreen, producing a dense collection of velocity points.

Time-lapse photogrammetry is a growing method in glaciology for providing measurements from oblique sequential imagery, namely glacier velocity. When we began processing our time-lapse images, we found that there were few publicly available toolboxes for what we wanted and the range of their applications was relatively small. For this reason we decided to develop PyTrx, a Python-alternative toolbox, to process our own data and also aid the progression of glacial photogrammetry with a wider range of toolboxes.

PyTrx is an object-oriented toolbox, consisting of six scripts that can be used to obtain velocity, area and line measurements from a series of oblique images. These six scripts are:

  1. CamEnv: Handles the associated data with the camera environment, namely the Ground Control Points (GCPs), information about the camera distortion, and the camera location and pose
  2. DEM: Handles data related to the scene, or Digital Elevation Model (DEM)
  3. FileHandler: Contains functions for reading in data from files (such as image data and calibration information) and exporting output data
  4. Images: Handles the image sequence and the data associated with each individual image
  5. Measure: Handles the functionality for calculating homography, velocities, surface areas and distances from oblique imagery
  6. Utilities: Contains the functions for plotting and interpolating data

PyTrx has been used to process the data presented previously, and is freely available on GitHub along with several example applications. These examples include deriving surface velocities and meltwater plume footprints from time-lapse images of Kronebreen, and terminus profiles and calving event locations from time-lapse images of Tunabreen.

Related links

This thesis is freely available to download from the Edinburgh Research Archive

How et al. (2017) The Cryosphere – Examining the subglacial hydrology of Kronebreen and its influence on glacier dynamics 

How et al. (In Review) Annals of Glaciology – Observations of calving styles at Tunabreen and the role of submarine melting in calving dynamics

How et al. (2018) Geoscientific Instrumentation, Methods and Data Systems – Presenting the PyTrx toolbox and its capabilities with oblique imagery of glacial environments

PyTrx – PyTrx toolbox code repository, hosted on GitHub

Subglacial hydrology at Kronebreen, Svalbard, published in The Cryosphere

The Cryosphere recently published our work on Kronebreen, a fast-flowing tidewater glacier in Svalbard. The study examines subglacial hydrology and its influence on basal dynamics over the 2014 melt season, drawing on simultaneous observations of water pressure at the bed, supraglacial lake drainage, meltwater plume activity, and glacier surface velocities. In addition, melt/runoff and hydraulic potential were modelled in order to estimate surface melt production and the routing of meltwater at the bed. Together these built a record from which we could establish a robust, theoretical picture of how water is channelled at the bed.
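
Hydraulic-potential modelling of this kind typically follows the Shreve (1972) formulation, in which subglacial water pressure is assumed to be some fraction k of the ice overburden (k = 1 at flotation). The snippet below is a minimal sketch under that assumption, using a toy surface/bed elevation pair rather than our data:

```python
import numpy as np

# Shreve (1972) hydraulic potential: phi = rho_w*g*z_b + k*rho_i*g*H,
# with bed elevation z_b, ice thickness H, and flotation fraction k
rho_w, rho_i, g, k = 1000.0, 917.0, 9.81, 1.0  # SI units; k = 1 assumes flotation

surface = np.array([[900.0, 850.0], [880.0, 820.0]])  # ice surface elevation (m), toy values
bed = np.array([[400.0, 380.0], [390.0, 350.0]])      # bed elevation (m), toy values

phi = rho_w * g * bed + k * rho_i * g * (surface - bed)  # potential (Pa)

# Water is routed down the steepest gradient of phi, so comparing phi
# between neighbouring cells indicates likely subglacial flow pathways
```

Running this over surface and bed DEMs of the whole glacier tongue gives a potential field whose down-gradient paths approximate where meltwater is channelled at the bed.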

One of the key findings is the difference in drainage beneath the north and south regions of the glacier terminus, which is linked to spatial variations in surface velocity. The study also shows a consistently high water pressure at the glacier bed throughout the melt season. These readings were collected from a borehole that was drilled approximately 3 km upglacier of the terminus. Borehole records from tidewater glaciers are rare but the few early studies that currently exist, including this one, suggest that bed conditions at tidewater glaciers are persistently pressurised, with a high hydraulic base-level that permits fast flow.

The Cryosphere Kronebreen site map figure

Figure 1 from the TC paper: the site map of Kronebreen, showing the locations of the three groups of supraglacial lakes (C1, C2 and C3) that filled and drained during the 2014 melt season. The lakes were monitored by seven time-lapse cameras installed on the rock outcrops surrounding the glacier tongue (orange numbered locations), and drained sequentially upglacier, mirroring the propagation of the speed-up event at the beginning of the melt season. The starred location marks where the borehole was drilled and the pressure sensor installed.

The Cryosphere Kronebreen maps

Figure 5 from the TC paper: Sequential velocity maps (left) and velocity change maps (right) of Kronebreen, derived from TerraSAR-X imagery. The south region of the glacier tongue is faster flowing than the north region throughout the melt season. We argue that this reflects a difference in drainage efficiency. An early-season speed-up event is depicted in the velocity change maps, which originates from the terminus and propagates upglacier. Similar speed-up events occur year-on-year at Kronebreen. These may reflect changes at the terminus early in the melt season which promote longitudinal stretching, and/or reflect a seasonal hydraulic overhaul which promotes basal sliding.

Further reading

The Cryosphere paper

Other studies at Kronebreen (here and here) which show early-season speed-up events

Borehole study at a tidewater glacier in Patagonia 


Ptarmigans love time-lapse cameras!

Ptarmigan at Kronebreen 01

We have been setting up time-lapse cameras in Kongsfjorden, Svalbard, since 2014 to observe glacier change over time. Ptarmigans have been known to nest by these cameras, and one particular camera is their favourite! This camera was set up on a rocky outcrop called Garwoodtoppen to measure velocities over Kronebreen glacier.

Ptarmigans at Kronebreen 02

Sometimes more than one ptarmigan will come to sit in front of this camera…

Ptarmigans at Kronebreen 03

…And we have noticed changes in their appearance through the season. Their feathers are normally white in colour over the winter and spring, but change to grey/brown in the summer. Over the course of a season (May – September), we capture roughly 20 ptarmigans in our images (out of a possible 6000 images). 

Ptarmigan at Kronebreen 04

Although these images have been useful to monitor ptarmigan activity in this area of Svalbard, they are also a bit of a nuisance for tracking glacier movement. When they are in front of the camera, they block a significant patch of the glacier that we are monitoring. Silly ptarmigans!