The Pensford 10k

Another week, another race, this time the Pensford 10k, but let's take a little step back first.

When I was putting together my ‘racing’ calendar (a very loose term for official events) I aimed to add at least one, preferably two, events per month so that the pressure of lining up against others kept me honest and motivated me to get my backside off the couch and training. When an event is close by and offers an “undulating, and seldom flat course” with a nice 50m and 100m steep climb in the middle, it is worth a try, and try I did.

On race day I lined up with nearly 200 other people, and amongst the normal pre-race banter I heard whispers from the more experienced runners: ‘it only gets going around the 5k mark, watch out’, and ‘the hill in the middle is a killer’. With this in mind, I moved myself from the 40min+ starting group to the 50min+ group, all the while wondering what I had gotten myself into.

I set off at a slow but steady pace, trying to conserve energy for ‘the hills’, although at this point I was overtaking people in the faster group, which had me a little worried. Despite a few hills the first 5k was fine: up, down, up, down, and up again, but rather doable. Then the 5k marker appeared and, right on cue, the hill that had been the focus of many a conversation at the start of the race.

Now, a 100m climb sounds rather small, and on paper it is, but when you are trying for a good race time, with tired legs, a dislocated shoulder (did I mention that?), and enjoying the best the South West has to offer in terms of windy, cold weather, the start of this tiny little hill was not welcome. I had set my Garmin watch to alert me whenever I was slower than an 8:30 pace, thinking the beep and I were never going to meet, but unfortunately that slow beep was my unwelcome companion the whole time I climbed this section.

Despite a slow time (51:27) I really did enjoy it. I am confident I could knock a large chunk of time off that next time, in good health, and with next year being the 30th anniversary of the race, I think I will be back to prove that.

Next on the agenda is the Keynsham 10 miler, but I am sorely tempted to add the Bath Ultra Marathon in September to the list; after all, I’m still looking for races.

Running the Cheddar Gorge 10k

This week I ran the 10k race at the inaugural 2016 Cheddar Gorge Challenge event. Billed as a ‘lumpy course’, this series of runs offers “more climbing just getting to the start than you will in most other events”. With the affectionately named Hell Steps towards the end this is a tough run but, more importantly, it is a fun event. Cheddar is beautiful, steeped in history, and picturesque from the bottom of the gorge, let alone from running up and down it, so the prospect of completing three races (10k, half marathon, and marathon) in and around the area was too enticing to pass up.

The terrain, according to the website, is “steep in places and very steep everywhere else”, which sets the scene, but to be honest I arrived at the event fully trained yet expecting the worst. In reality there were hard sections, easy sections, a little doubt around the 5k mark that I could actually complete the event, and a ‘this is great’ moment around 7k. The distance is tiny compared to the training I have done, but the combination of the race-day ‘too fast start pace’ and the lumpiness hit my tired legs hard. There was never a time when I truly wanted to give up, but there were times when I thought about a brisk walk rather than running. Overall I am happy to say that even up the Hell Steps I broke into a run and completed the course in a not impressive, but rewarding, 58:28. I had quietly wanted a sub-1hr finish and, given the course, was a little sceptical about managing it, but even with the wind against us for most of the race (how does it constantly blow directly at you regardless of your orientation?) I hit my goal.

So why would I put my body through this, I hear you ask. Well, this year I am doing something out of my comfort zone, something a little crazy (for me), and something that I hope will make a difference to others. This year I am running, a lot, and for a good reason. This year I am running to raise money for Macmillan Cancer Support. To read about my story please click the link, which is also where you can donate to this very worthy cause. I am putting myself through several challenges, including a marathon in the Himalayas of Nepal and a 45 mile ultra marathon, because there are people out there who just can’t, so let’s together make their lives a little easier during a really hard time.

So if you can, please donate.

A Change of Scenery

A few weeks ago I joined Canonical and the eagle-eyed among you will realise this is actually my second time. Previously at Canonical I spent my days with the Mobile Team, realising the goal of a good Linux on ARM experience, which eventually culminated in the foundation of Linaro. This time I am equally excited about another formative stage of technology, that of IoT and the possibilities of interoperable and extensible devices running a standard Linux operating system.

Personally, I will be working directly on Snappy Core, the technology used to provide a stable, secure, transactional, and featureful platform for the Internet of Things (IoT) and beyond. I believe in the future of IoT, big data, and a world realising Mark Weiser’s vision of Ubiquitous Computing, and I am excited to be part of it.

Running in Nepal

Nepal is such a wonderful place. Steeped in history and culture, with some of the most breath-taking sights to be seen, Nepal is home to Buddhism, the Himalayas, and of course the mighty Mt Everest. Despite all this richness, the country and its people face a lot of challenges. Economically, Nepal is considered a third-world country, with many people living in utter poverty. To compound this hardship, Nepal has also experienced terrible earthquakes that have left many dead and even more without basic needs such as accommodation, access to food and clean water, and education. Last year’s earthquakes were devastating and the effects are still being felt. This had me thinking: wouldn’t it be great if I could do something, no matter how small, to help out in some way?

Through my love of running I got to learn about the Impact series of marathons, and right there, as the inaugural run, was Nepal. Impact Marathons has lofty goals; in fact, they are aiming to contribute to the 17 UN Global Goals through running: “Our runners will truly see the impact of their time, money and resources by visiting and meeting all of the projects they are supporting”. As part of the week-long trip we will be helping to repair a school that was badly affected by the earthquake, as well as undertaking other community projects. The actual marathon will be at the end of the week and promises to combine stunning views, up to 2300m of altitude, and 26.2 miles of running for a great cause. I am really looking forward to it.

This year I am hoping to complete several new challenges and up my running to ultra distances. At the moment, Nepal is my final planned race of the year but between now and then there is a lot of training to do.

Parkrun

I have a confession to make. While I have publicly supported the parkrun initiative for some time and I wholeheartedly believe it is a great idea, I have never actually run one myself. There have been many excuses, from family conflicts at the weekends to blaming the weather, but this week I thought I would ignore all of that and partake in the spectacle. I chose the closest event, which for me was Southwick Country Park in Trowbridge, and arrived before the customary 9am start time. Nearly 300 people turned up to run the wet and muddy course, and despite not knowing anything about the logistics of parkrun (you pick up a finishing token at the end and have it scanned along with your personal parkrun barcode) I mingled into the crowd ready to run.

The course itself was great: 3 laps around a semi-gravelled path with some sections completely covered by 6 inches of rainwater. On the first lap I tried to avoid the water, but that only pushed you onto the slippery surrounding mud, so for laps 2 and 3 it became obvious that the best option was to just get your feet wet.

I did not manage any personal bests, but that was not the point; the event itself was great fun and the volunteers were excellent. This may have been my first parkrun but it is definitely not my last; I will be there again this Saturday, 9am, ready to do it all again.

Fujitsu ScanSnap 1300i with Ubuntu

I’ve been using the Fujitsu ScanSnap 1300i on Mac OS X for some time now in the pursuit of a paperless life, but now that I am using Ubuntu more and more it was apparent that I would have to get this little scanner working on Linux. Searching the internet threw up a few interesting articles but nothing worked 100%. In the end, the steps I used under Ubuntu 15.10 were:

Install sane and gscan2pdf:

$ sudo apt-get install sane gscan2pdf

Download the scanner firmware from http://www.openfusion.net/public/files/1300i_0D12.nal and copy it to the relevant directory:

$ wget http://www.openfusion.net/public/files/1300i_0D12.nal && sudo mkdir /usr/share/sane/epjitsu && sudo cp 1300i_0D12.nal /usr/share/sane/epjitsu

Open up the scanner lid first, then initialise the scanner with:

$ sudo scanimage -L

Run the gscan2pdf application:

$ gscan2pdf

You can tweak some of the scanner options by clicking the scan button and playing around with the tabs in the pop-up box. For me the most important options were duplex, as I regularly scan double-sided documents, and colour.
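
If you prefer to drive the scan from the command line instead of the gscan2pdf dialog, something along these lines should work. Treat it as a sketch: the option names (ADF Duplex, Color, and so on) are what I would expect from the epjitsu backend rather than values I have verified here, so check scanimage --help for your own setup first:

$ scanimage --source 'ADF Duplex' --mode Color --resolution 300 --format=tiff --batch=scan_%03d.tiff

Each sheet fed through the document feeder should then land in scan_001.tiff, scan_002.tiff, and so on.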

Happy scanning.

Running Ubuntu Snappy Core Virtualised on Mac OS X

Although most of the documentation out there today shows you how to run Ubuntu Snappy Core on an Ubuntu desktop, it is also pretty simple to do this on Mac OS X. In short:

Download the Ubuntu Snappy Core image from:

http://releases.ubuntu.com/15.04/

You will need the amd64 version of Snappy.

Unarchive the file:

unxz ubuntu-15.04-snappy-amd64-generic.img.xz 

Then convert the image into something that VirtualBox can run:

qemu-img convert -f raw -O vmdk ubuntu-15.04-snappy-amd64-generic.img ubuntu-15.04-snappy-amd64-generic.vmdk
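
If you want to sanity-check the conversion before going any further, qemu-img can describe the resulting file (the sizes reported will obviously depend on the image you downloaded):

qemu-img info ubuntu-15.04-snappy-amd64-generic.vmdk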

At this point you want to create a new VM in VirtualBox. Make sure you select Linux as the type and Ubuntu (64-bit) as the version, and when you get to the Hard drive section select “Use an existing virtual hard drive file”. Navigate to your .vmdk image and click Create. Now, when you start the VM you should be greeted (pretty quickly) with the login prompt from Snappy.
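
If you would rather skip the GUI, the same VM can be put together with VBoxManage. Treat this as a rough sketch rather than a verified recipe; the VM name, memory size, and controller name below are just my own choices:

VBoxManage createvm --name snappy --ostype Ubuntu_64 --register
VBoxManage modifyvm snappy --memory 1024
VBoxManage storagectl snappy --name SATA --add sata --controller IntelAhci
VBoxManage storageattach snappy --storagectl SATA --port 0 --device 0 --type hdd --medium ubuntu-15.04-snappy-amd64-generic.vmdk
VBoxManage startvm snappy

The storageattach step points the VM at the .vmdk created above, so adjust the path if yours is named differently.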

Installing Ubuntu Snappy Core on a Raspberry Pi 2 using a Mac

This is a short guide to installing Ubuntu Snappy Core on a Raspberry Pi 2 using a Mac. It is pretty straightforward but there are a couple of areas where you can get caught out.

First, download the Ubuntu Snappy Core image for the Raspberry Pi 2. As of writing the latest release was:

  • ubuntu-15.04-snappy-armhf-rpi2.img.xz

Insert your SD card if you haven’t done so already and use diskutil to find it.

$ diskutil list

Make sure you are confident that you know exactly which disk is your SD card before proceeding. The relevant part of my output was:

/dev/disk4 (external, physical):
#:                       TYPE NAME                    SIZE       IDENTIFIER
0:     FDisk_partition_scheme                        31.9 GB     disk4
1:                  Windows_FAT_32 Untitled          31.9 GB     disk4s1

Unmount the disk with:

$ diskutil unmountDisk /dev/disk4

Then proceed to write your Ubuntu image to the card with:

$ unxz -c ubuntu-15.04-snappy-armhf-rpi2.img.xz | sudo dd of=/dev/rdisk4 bs=32m && sync

Notice the use of ‘r’ in front of disk4: /dev/rdisk4 is the raw, unbuffered device, and writing to it is considerably faster than writing to /dev/disk4.
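
One small tip: dd gives no feedback while it writes, which can be unnerving with an image this size. On a Mac you should be able to press Ctrl-T in dd’s terminal to print its progress, or send the equivalent SIGINFO signal from another shell (treat the exact pkill invocation as a sketch):

$ sudo pkill -INFO -x dd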

That’s all there is to it. Pop the SD card into your Raspberry Pi 2 and start using Ubuntu Snappy Core.

Life with the Apple Watch

I confess, I’m a bit of a gadget hound. I own four different smart watches all with different OSs:

  • Pebble with PebbleOS
  • Samsung Gear with Tizen
  • Motorola 360 with Android Wear
  • and now the Apple Watch with Watch OS

When I first got the Pebble (Kickstarter model) I was instantly impressed. It was a device that lasted days, gave me notifications at a glance, and allowed me to keep my phone in my pocket unless it was really needed. The trouble is that I got bored pretty quickly with its lack of functionality, and the Samsung Gear looked enticing. With the ability to make calls, as well as everything the Pebble did and more, it seemed like a no-brainer. Unfortunately it originally came with a version of Android so buggy it made the device pretty much unusable; much later this was ‘upgraded’ to Tizen, which to its credit was better, but still limited. Enter Android Wear. Launched in a blaze of publicity at Google I/O 2014, this wearable OS seemed perfect, so I went and purchased the Motorola Moto 360, arguably the best looking device on the market. Unfortunately, this was also crippled: no sound (so no beeps, no notification noises), no ability to make and receive calls, no real way to get back to notifications once they were dismissed, and no really compelling stock applications. The watch just felt like a device that vibrated every time something happened and was to be ignored at all other times. Android Wear just wasn’t compelling enough, so I always gravitated back to my Garmin watches (Forerunner 620, Forerunner 920XT). Now there is the Apple Watch.

I’m still in the honeymoon stage with it at the moment, but I have been wearing it exclusively for the past three weeks. It has proven to be a useful aid: the fitness app is a bit poor (I’ll revisit this point in another blog post) but overall the experience of using it has been pleasant. I can read messages and email, make and receive calls, the calendar app is super useful, and I’ve found that I use it extensively for reminders. All in all it has been a success so far, but it is not without its problems. Watch OS 2.0 promises to improve the device further and I am certainly looking forward to it, but for now the Apple Watch is the best smart watch in an immature market.

Humans

As a scholar of software engineering with a particular interest in the fields of ubiquitous computing and artificial intelligence, the recent series by AMC, “Humans”, really did pique my interest. Is it based on sensationalism, or is it something that could be considered grounded in reality? Well, I believe it is a drama that reflects more of the latter than the former. I really like the concept so far, and it raises questions that until now only academia has explored in detail and that movie studios love: concepts such as artificial understanding, consciousness, love, and the projection of human traits onto non-human subjects (anthropomorphism).

Sure, the movie industry has toyed with a multitude of these concepts, with many dollars flowing in at the box office, but what Humans does is ground them in such mundane, run-of-the-mill reality that I really like its play with boundaries, the grey line the whole programme toys with. What defines humanity and what distinguishes it from an imposter? What is the point in humans learning when machines can do it much better? Perhaps more importantly, do the majority of people, the Joe Bloggs of the world, care about the gap between what is possible with strong AI and what is human? That is a question I believe will be at the forefront of minds for the next 50 years.

In short, I really do like Humans so far.

Creating bootable USB images on the Mac

Creating a bootable image for installing a Linux OS is pretty straightforward, but when you are doing this on the Mac there is a specific way it needs to be done. I always use USB drives for this purpose, so what follows are the steps needed to create a bootable USB stick from a Linux .iso image.

I presume you have already downloaded your favourite Linux distribution in .iso format; below I’m using Debian Jessie.

First, convert the .iso image into a .img image. Note that hdiutil appends a .dmg extension to the output file, which is why the file is referenced as .img.dmg later on.

$ hdiutil convert -format UDRW -o debian-jessie-DI-rc1-amd64-netinst.img debian-jessie-DI-rc1-amd64-netinst.iso

You then need to find your USB drive.

$ diskutil list

Look for your USB device. I’ll use /dev/disk7 for this example. First, make sure it is unmounted.

$ diskutil unmountDisk /dev/disk7

Then copy the image to the USB stick. CAUTION: this will overwrite anything that is already on the drive.

$ sudo dd if=debian-jessie-DI-rc1-amd64-netinst.img.dmg of=/dev/disk7
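
As with writing SD card images, this step can be sped up considerably by writing to the raw device and using a bigger block size. A variant of the same command that should work, although I have only used the plain /dev/disk7 form above:

$ sudo dd if=debian-jessie-DI-rc1-amd64-netinst.img.dmg of=/dev/rdisk7 bs=1m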

Safely eject the USB disk before using it for booting on your target device.

$ diskutil eject /dev/disk7

And there you have it, a bootable, Linux install USB drive.

Trusted Execution Environments in Android

Continuing on from my post about TrustZone, it seems that there is a lot of interest in hardware-backed security for Android and what you can do with it. One of the most interesting things that a hardware-isolated area can do for devices, whether that be a dedicated co-processor or a technology such as TrustZone, is to provide a trusted environment dedicated to protecting your most valuable assets and the operations performed on them. Installing something like a micro operating system in this divide can give you a lot of features that the main OS just cannot gain access to, and this is the thrust of standards bodies such as Global Platform 1. This micro OS, or to use the popular parlance, a Trusted Execution Environment (TEE), is becoming more important in a world of one-click / swipe / wave-a-device payments and device authorisation, and over the coming years it will see a surge in popularity not only from independent vendors but from the large OS vendors too. But let’s take a step back.

The concept of a Trusted Execution Environment is to provide a secure area of the main processor, memory, and peripherals that can be used to perform privileged operations. First defined by the Open Mobile Terminal Platform (OMTP) forum in their Advanced Trusted Environment: OMTP TR1 standard 2 and later adopted by Global Platform in their standardisation effort, the TEE has become a bridge between pure software security mechanisms and hardware-only solutions. The TEE uses the isolation that technologies such as TrustZone enable to execute in the processor’s Secure World mode.

The TEE can be a fully-functional operating system offering software developers the opportunity to create Trusted Applications: applications that reside in the Secure World and perform security-critical functions outside of the control of the main operating system running in the Normal World. An example of such a Trusted Application can be a Trusted User Interface (TUI) - a display that is presented to the user completely protected by the Secure World and inaccessible to the main operating system such as Android. The interface could display sensitive information such as passwords and be confident that attacks such as screen scraping or video buffer capture would not reveal anything.

It is clear that the popularity of TEEs is increasing. Based on one commercial TEE vendor’s press releases, the adoption rate of the Trustonic TEE is reported to be over 100m devices every 6 months (source: http://www.trustonic.com - figures from February 2014 to July 2014), although widespread utilisation by third-party developers is yet to be exploited. Ekberg et al 3 attribute this to a lack of access to the TEE, stating that “Despite TEE’s large-scale deployment, there’s been no widely available means for application developers to benefit from its functionality as mobile device manufacturers have restricted TEE access to their internal use cases.”, but also admit that standardisation could potentially solve this issue. Recent announcements by companies such as Linaro point to a more open access model 4, but we are yet to see commercial devices with OP-TEE technology.

In short, TEEs are here to stay and I expect that the likes of Apple and Android will open up access to this trusted area for more developers to enhance the security of their applications in the near future.

What are you passionate about?

I have recently been reading the book Talk Like TED by Carmine Gallo, which promises to bestow the virtues of great public speaking upon all who read it. Early on in the book there is a rather salient point that got me thinking, a point that starts with a simple question: “What are you passionate about?”. Now, there are quite a few things I am passionate about, but in the context of Software Engineering, my chosen career path, it is something that underpins all the great projects I have really enjoyed working on over the years. What is it? Data.

I am passionate about data, specifically the conclusions you can draw from it. This is not to say the actual gathering of data, although that can be quite interesting in itself: constructing tools and processes as you squirrel away the nuts of information that together paint a picture no individual data point can allude to. I am more passionate about the ‘Where’s Wally’ dance: the finding of that little something you’ve been looking for in a sea of noise, the epiphany, the moment, the unveiling. The answer to the puzzle is something you intrinsically know is just outside your grasp, and with the data, that collection of measurements and information, the answer will magically appear. The puzzle is made up of a thousand pieces and by putting them all together it becomes clear. That is what I’m passionate about. I guess my career has always followed that route of problem solving.

Software Engineering is a great field to be in if you enjoy problem solving: you get to create a solution from parts constructed with only your imagination, a programming language, and your favourite text editor. In my experience, the first solution you produce is often not quite what you were looking for, and the itch remains. You continue to iterate, introduce bugs, fix bugs, and think of new and novel ways to answer your initial questions, and finally you have something that not only works but satisfies that itch. When you employ this process to scratch a larger itch, a higher-level, more abstract problem that requires the gathering and analysis of data, there is satisfaction both from the initial problem solving during development and from discovering that pattern or snippet of information that maybe you only suspected was there before but is now proven with the data. Maybe this explains why I have an affinity with Pervasive Computing and its latest incarnation as a buzzword, the Internet of Things (IoT). Data inference is what I really enjoy.

I’ve gathered a lot of data over the years: email archives and usage data, energy monitoring and the subsequent discovery of inefficient appliances, health data with Fitbit and Garmin, lifestyle monitoring with Slogger; it can all be combined to do wonderful things. But there is a tendency to gather data just for the sake of it, and I have certainly been guilty of that. Now I am starting to take a step back and trust the data more, to make informed decisions based upon it, so let’s see how that goes this year. Big data is definitely here, but the more important question everyone should be asking is “What do we do with all that data and how can it benefit humanity?”.

TrustZone For Android Mobile Security

Recently I was asked to provide a quick, high-level introduction to TrustZone and how it could potentially improve security on Android platforms. Any response to this is tricky: TrustZone is just a mechanism built into a platform that, if unused, does very little for device security, but when utilised to its fullest it can create a totally separate environment dedicated to protecting your most important secrets. But first, a bit of background.

According to Bloomberg 1 ARM’s chip designs are found in 99% of the world’s smartphones and tablets; 2013 alone saw ARM’s partners ship over 10 billion chips (source: ARM Strategic Report 2013). Popular devices such as the Apple iPhone and iPad, Amazon’s Kindle, and Samsung’s flagship Galaxy series all use a Central Processing Unit (CPU) based on an ARM design. In 2004 ARM released its design for a hardware-enforced parallel execution environment for the PB1176 and ARMv7 architectures that was adopted into all later application processor designs.

TrustZone itself is an implementation of device-level security utilising extensions to the CPU and the Advanced Microcontroller Bus Architecture (AMBA), or memory bus. By connecting all these components together in a homogeneous architecture it is possible to construct two distinct ‘worlds’: a “Secure World” and a “Non-Secure World” (or “Normal World”) 2. The two modes are orthogonal to each other, with the Secure World enjoying full access to all memory regions and privileged CPU areas, whereas the Normal World can be restricted. This arrangement is configured during the boot process. The interface between the two worlds is governed by a special Secure Monitor Mode, accessible via an interrupt instigated with the Secure Monitor Call (SMC) instruction. Identifying which world the processor is currently executing in is possible through the use of an extra ‘flag’ known as the NS, or Non-Secure, bit. All components that wish to use the functionality provided by TrustZone must be aware of this flag.

With TrustZone it is possible to isolate an area of the CPU, memory, and peripherals for use by a trusted software component called a Trusted Execution Environment (TEE) 3 or other such privileged software. For example, Android’s implementation of the core cryptographic keystore functionality, KeyChain, can use hardware components such as TrustZone, the SIM card, or a Trusted Platform Module (TPM) to enhance overall security. By using TrustZone a device can provide secure software functionality, backed up by the hardware it is running on.

It is clear that with more widespread use TrustZone could benefit an increasingly mobile society who expect to do the most secure of operations with their devices.


  1. http://www.bloomberg.com/bw/articles/2014-02-04/arm-chips-are-the-most-used-consumer-product-dot-where-s-the-money
  2. J. Winter. Trusted computing building blocks for embedded Linux-based ARM TrustZone platforms. In Proceedings of the 3rd ACM Workshop on Scalable Trusted Computing, pages 21–30. ACM, 2008.
  3. J. Winter. Trusted computing building blocks for embedded Linux-based ARM TrustZone platforms. In Proceedings of the 3rd ACM Workshop on Scalable Trusted Computing, pages 21–30. ACM, 2008.

Getting back into blogging

It’s been a while; in fact it has been around a year since I updated this site (to be fair, I did write a few posts on another blog during that period … excuses, excuses), which I attribute partly to an increasingly busy schedule but more to a lack of enthusiasm. So, in an attempt to get back into this blogging lark, I thought it would be a good opportunity to redesign the site with Hugo, a static, but more importantly Markdown-based, web engine, and put up a few articles on something dear to my heart: Software Engineering. So expect more development-related posts interspersed with running, triathlon, travel, and other randomness as I attempt to do this on a semi-regular basis.

Oh, and if you are looking for any of my past entries from 2007 onwards, they will be back up shortly as I figure out how to convert the WordPress content to Hugo and still keep some form of resemblance to the original posts.