Welcome to my thoughts splurged out onto the electronic page. Have a look at the most recent posts below, or browse the tag cloud on the right. An archive of all posts is also available.



Just a link to the code that I wrote for loading the human calendar onto the LCDSysInfo screen that I bought on eBay.

#!/usr/bin/env python

# Script to upload the human calendar to an LCDSysInfo screen
# Requires the pylcdsysinfo library at:
# https://github.com/dangardner/pylcdsysinfo
# Written by John Cooper 2013

#    This program is free software: you can redistribute it and/or modify
#    it under the terms of the GNU General Public License as published by
#    the Free Software Foundation, either version 3 of the License, or
#    (at your option) any later version.
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#    You should have received a copy of the GNU General Public License
#    along with this program.  If not, see <http://www.gnu.org/licenses/>.

import requests
from BeautifulSoup import BeautifulSoup
#from PIL import Image
from StringIO import StringIO
import pylcdsysinfo
from pylcdsysinfo import LCDSysInfo, TextAlignment, TextColours, large_image_indexes

r = requests.get("http://api.humancalendar.com/iframe.php?t=2x2&s=250")

page = BeautifulSoup(r.text)
img = page.findAll('img')
print img[0]['src']
r = requests.get(img[0]['src'])
calendar_image = pylcdsysinfo.Image.open(StringIO(r.content))
calendar_image = pylcdsysinfo.simpleimage_resize(calendar_image)
rawfile = pylcdsysinfo.image_to_raw(calendar_image)

slot = 0    # which of the large image slots to store the calendar in

d = LCDSysInfo()
#d.write_rawimage_to_flash(1, rawfile)
d.write_rawimage_to_flash(large_image_indexes[slot], rawfile)
d.display_icon(0, large_image_indexes[slot])

Also available as a Human Calendar code gist

Posted Fri Aug 9 10:51:26 2013 Tags:

In old versions of GNOME the command gnome-screensaver-command -l would lock your screen.

As gnome-screensaver is no longer shipped in GNOME 3.8 you now have to send a D-Bus call. I think this is then handled by GDM.

   dbus-send --type=method_call --dest=org.gnome.ScreenSaver /org/gnome/ScreenSaver org.gnome.ScreenSaver.Lock
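
If you want to trigger the same lock from a script, the call wraps up in a few lines of Python (a sketch using subprocess; the lock_screen helper name is mine, not part of GNOME):

```python
import subprocess

# The dbus-send invocation above, split into an argument list.
LOCK_CMD = [
    "dbus-send",
    "--type=method_call",
    "--dest=org.gnome.ScreenSaver",
    "/org/gnome/ScreenSaver",
    "org.gnome.ScreenSaver.Lock",
]

def lock_screen():
    """Ask the GNOME screensaver (via D-Bus) to lock the screen."""
    subprocess.call(LOCK_CMD)
```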
Posted Thu Jul 11 09:36:16 2013

I needed a coat stand. I had a pile of coats on the floor and some hung up on G-clamps on the back of a door. It was not the best of solutions. After some searching on the web I decided that there were none that I really liked, so I thought I would make one for myself. Just three months later and I have something that I am happy with.

My final design came to me in the concrete aisle at B&Q. So I bought some cement, sand, and a bucket. Next up was 2m of steel box section, some coat hooks and a set of LED garden lights. (What? Every coat stand needs lights.) My idea was simple: use the bucket as a mould for a concrete base, use the steel box as the upright, and attach the hooks at the top. The bonus was setting the lights into the concrete for extra class.

I like to let these things form in the back of my mind so I got it all home and put it by the pile of coats for a bit. First off was spacing the lights. As I sat eating a ready meal I noticed that it came in a metal tin just a little smaller than the bottom of the bucket I was going to be using to make the base. Sorted. image

I printed out a template with 10 evenly spaced holes round a circle, stuck it to the tin and drilled some holes. It turns out drilling 10mm holes in tinfoil does not really work, and a punch would have been better I think, but I got there in the end. Then I fed in all the lights and created this squid-like creature.

image image

I decided to stick the lights to the bottom of the bucket using Blu-Tack, which mostly worked, but I would try double-sided tape next time as the Blu-Tack left the lights quite recessed in the concrete. The idea with the concrete was to form a large lump at the base: by using the bucket as a mould, then turning the set concrete out and over, I should get a nice smooth dome. Sticking the lights to the bottom of the bucket should make them shine up from the top of the base in the final piece, so that is what I did. For the box section I just wrapped it in a plastic bag and dropped it into the centre of the bucket so that when the concrete had set I could pull it out again. This also allowed me to get it straight, and it worked pretty well.

image image

I mixed up some rather runny concrete, about one to one cement and sand, and dribbled it into the bucket. Not really having a clue what I was doing, I just hoped that would be okay. I had intended to make three layers with different shades but in the end went with just two. image

I filled up the bucket to just above the level of the tin, trying not to knock the lights off their bases, and then let that dry for a few days before adding a second layer to bring it most of the way up the bucket. I think that I should have made more of an effort to clean up between the layers as it did leave some marks on the edges. Just cleaning the edge of the bucket and sweeping the residue to the middle would have done it I think, but it looks okay. image

The big reveal came a week or so later as I wanted to be sure that it had a good chance to set. The pole pulled out without too many problems, and the bag followed quite easily, so I was happy with that. I turned the bucket upside down over a couple of sticks so that it did not fall on the wires and started to tap. With not much effort at all it dropped out. I was fully expecting to have to destroy the bucket to get it off but it worked really well. The top is really smooth and I was really impressed with how it came out. I quite like the fact you can even see the litre markings from the bucket up the side. The join is a little rough and I may end up making a band to cover it up, but for now I will just say it adds character.

image image

All but one of the lights had stuck down okay, and with just a little work with a screwdriver the Blu-Tack and the edge of the concrete could be cleaned away simply enough. I think the tinfoil tray had mostly kept the last one in place, so it was not too bad and is just a little more recessed. I put the pole back into the top and added three little wooden feet underneath, plus one more for the bottom of the pole. I switched on the lights and had a rather elaborate storage for one hat. I left the lights on for a bit and they did not seem to get even slightly warm so I hope they will be okay. They are rather bright so I might look at running them from 6 or 9 volts rather than the 12 they currently run at. image

Now for the hooks. I bought four simple aluminium hooks with 3 prongs on each, and my first thought was just to drill holes in the steel pole and attach them straight on, but somebody pointed out that may leave the hooks a little close to each other. I decided to get a couple of bits of wood, clamp them round the pole and then screw the hooks onto that. It was about that time that my 3D printer sprang back into life, so after a little messing about in OpenSCAD I designed a block that I could slip over the top of the pole and attach the hooks to. Sorted. image

I then used some matchsticks to wedge the top and bottom of the pole in place, and some Sugru to tidy up the hole a little, and I was done. image image image image I am pretty happy with it. It's unique and quite neat I think. There are some things that I would change. Using sticky tape for holding down the lights would make them more flush. Taking out the pole at the second or third layer and sealing the hole with a plate would mean the pole does not sit directly on the floor needing a wooden plate to spread the weight. The join on the concrete could have been cleaner if I had made more of an effort to tidy it up before starting the next layer. And I might get a second bucket to mix the concrete in, as doing it in an old Innocent lunch pot was a little slow going!

Posted Tue Jun 18 22:26:55 2013 Tags:

We have been using puppet for a few years now and have a legacy of modules, some of which are common and others that are quite specific. I want to make some changes to one of the common modules. When I have done this in the past I have just changed the module and rolled the changes out to a couple of servers using a dev branch before moving it live, but this time I want something a bit more reliable.

Wooden rake

These are some really simple notes on how I got started. This is not going to be a tutorial but should point to some of the parts needed to get started. A good place for more detail is the Puppet Labs post on next generation puppet module testing.

Firstly install the requirements.

Install rvm to get a ruby you can play with.

Install bundler: gem install bundler

Create the bundler Gemfile. (Puppet Labs recommend the filename .gemfile but I think Gemfile is clearer.)

source :rubygems
puppetversion = ENV.key?('PUPPET_VERSION') ? "= #{ENV['PUPPET_VERSION']}" : ['>= 2.7']
gem 'puppet', puppetversion
gem 'puppetlabs_spec_helper', '>= 0.1.0'
gem 'puppet-lint'

Then install the bundle of stuff with bundle install. You should now have all the bits required to get going.

Next up we need to create some fixtures. These are helpers that the test code can set up so that your module runs. In this example it just makes sure you have access to all the relevant modules that you require. So create a file called .fixtures.yml (note the leading dot).

    fixtures:
      repositories:
        #stdlib: git://github.com/puppetlabs/puppetlabs-stdlib.git
        #apt: git://github.com/puppetlabs/puppetlabs-apt.git
      symlinks:
        tomcat: "#{source_dir}"
        authbind: "#{source_dir}/../authbind"
        stdlib: "#{source_dir}/../stdlib"
        apt: "#{source_dir}/../apt"

I have chosen to just use the modules locally, as we have them all in our tree, but you can use the commented-out part to pull the modules in directly. RSpec will use this file to fill the spec/fixtures/ directory when it builds up a copy of the code for testing.

Next up is a Rakefile (much like a Makefile but for ruby); this allows us to use rake to run the tests and other commands simply. This gives us access to commands like rake spec and rake help.

require 'rubygems'
require 'puppetlabs_spec_helper/rake_tasks'

# rake lint
require 'puppet-lint/tasks/puppet-lint'

I added the extra line so that you can run lint against the module as well: rake lint

Now we just need a bit of ruby to help the tests run, in spec/spec_helper.rb:

require 'rubygems'
require 'puppetlabs_spec_helper/module_spec_helper'

Now you can create your tests and run them with rake spec. I am not going to talk about the tests here as I have only written one so far, but the Puppet Labs link gives some examples. This is just to get the basics in place.

I think there is also a puppet-rspec-init somewhere that would allow all this to be built automatically in the future.

Update: I also added a .gitignore file in the module root so that the fixtures dir does not get checked in. It just has spec/fixtures in it at the moment.

Posted Wed May 15 11:17:46 2013 Tags:

I came across a cool little tool called Nuvola that is a wrapper around various cloud music services such as Grooveshark and Google Music. It not only makes them into full applications but does things like add support for scrobbling and multimedia controls. I quite like having my web apps appear as full applications and use the "Make Application" function in Epiphany to do it quite often. That way typing "gma" will either switch to the running app or start up a new Gmail window from anywhere on the desktop, and I find that regular desktop controls like alt-tab are easier to manage than tabs in a browser for apps that I use regularly.

This rambling post does have another point to it though. A lot of the cloud music sites still use Flash to play the music. Flash has always caused problems because it's closed source and so people can't fix all the niggling issues; instead we have hacky workarounds. I present a stack of them here.

When my laptop is docked at work I have two sound cards and for the most part it works okay. I can switch between them without any problems and use my headphones or speakers that are always plugged into my dock. When I tried Nuvola it refused to send the sound to the dock sound card and was not even showing up on the list of apps using the sound card.

TL;DR: apt-get install libasound2-plugins:i386

I will go into some more detail now. Nuvola is quite new and uses the GTK3 toolkit. Flash is quite old and uses the GTK2 toolkit, and we are waiting on Adobe to update it. So in the meantime we have to make them play nice together. To do this we use nspluginwrapper, which runs flash in its own process space. There is a howto on the Nuvola site for installing a compatible version of flash, and that should at least get you flash running and some sound coming out.

Next up is the problem that this wraps an i386 version of flash, and if you are running on amd64 like I am then it needs some extra libraries to make it work. To do that you need to set up multiarch support. Simply run dpkg --add-architecture i386 and then apt-get update as root. You can then install the asound plugins for i386 so that the Flash plugin can talk to pulseaudio: apt-get install libasound2-plugins:i386. Restart Nuvola and you should be good to go.


Bike chain and Chain-L No5 I have been trying a few different chain oils recently. I thought that I would give Green Oil's White Super Dry Chain Wax a try but it kept going squeaky after a couple of days. So I went back to the standard Green Oil, which is pretty good. Then I read a blog post, one thing led to another, and I am sending off for a sample of Chain-L No5.

Chain-L No5 sounds like it's the perfect bike chain oil. Drop some on then do nothing more for the next 1000 miles! It's quite sticky when you pour it out but smells like proper engineering. I will report back on how it goes.

Before I started all this I needed to clean the chain a bit. I decided to do this off the bike for a change. SRAM PowerLinks are really neat; I had never seen them before but they make taking the chain off a doddle. The chain was then dipped into some white spirit in a tub and given a good shake. I made a makeshift filter out of kitchen towel to clean the white spirit so I could use it again and again (how cheap am I?) and it was sparkling in no time. Then I left it out in the sun to dry for a while.

Applying the oil was even simpler. A drop on each junction and allow it to seep in. Then wipe off the excess and you are done. The chain felt smooth and the oil left a sort of film on the outside just like they said it would. Time will tell how it goes but if I can really just rub it down now and again for the next 1000 miles that will be great.

Bike floss Cleaning in the sprockets next and it was over to the bike floss. They are essentially oversized pipe cleaners with a mixture of soft floss and harder bristles. You pull it between the cogs and the dirt just lifts off.

Sparkling bike chain Sprockets Worked a treat as I think you can see from this really bad photo. I would recommend them to anybody. There was quite a bit of dirt in there and I had started with a screwdriver picking lumps out but the floss was much easier and more effective.

Posted Sat Sep 8 22:42:06 2012 Tags:

I discovered the puppet describe command the other day and thought it would be useful to be able to access it from within vim, so I created a little function to do it. Thought it might be useful for others so I have noted it here.

Just drop this into ~/.vim/ftplugin/puppet.vim and you should be good to go. Typing \pt (leader, puppet, type) when over a puppet type should pop up a definition.

Edit: Now available to download from github.

" Takes some settings
" g:puppet_command - The location of the puppet command
if !exists('g:puppet_command')
  let g:puppet_command = 'puppet'
endif
" g:puppet_doc_window - The type of puppet doc window to open
if !exists('g:puppet_doc_window')
  let g:puppet_doc_window = 'split'
endif


" Define a puppet type
function! PuppetDefineType(fname)
  " Show the puppet definition of the current puppet type that the cursor is
  " sitting on.

  " Switch to an existing __doc__ buffer, or create a window with a new one
  if bufnr("__doc__") > 0
      execute g:puppet_doc_window.' | b __doc__'
  else
      execute g:puppet_doc_window.' __doc__'
  endif
  " Make this buffer disposable
  setlocal noswapfile
  setlocal buftype=nofile
  setlocal modifiable
  " Clear the buffer
  normal ggdG
  " Read in the new description
  execute 'silent read !'.g:puppet_command.' describe '.a:fname
  " Go back to the top
  normal 1G
  setlocal nomodified
  setlocal filetype=rst
endfunction


" Set up two commands to define the word under the cursor
map <buffer> <leader>pt :call PuppetDefineType('<C-R><C-W>')<CR>
map <buffer> <leader>pT :call PuppetDefineType('<C-R><C-A>')<CR>

Things that may improve this are the ability to run the command from anywhere in the file and have it find the nearest type, and referencing local types. Let me know if you find it useful and I may ask for it to be added to Rodjek's puppet vim module.


Posted Tue Apr 24 08:34:22 2012 Tags:

We are planning some software upgrades to our main website and, as in previous iterations, I started to write some tests. Just some basic stuff to make sure that the config that we have set up does what we think it should do. I am not testing the applications or content yet but hopefully some of what we have learnt can help others to do that.

Test result trends

In the past I have written some unit tests in python that were simple and ran from the command line. This time we thought that it would be good to make these tests slightly more permanent.

So I picked some tools as detailed below and we set about working out a set of tests. For a start we have just trawled through the long and historical Apache config, picking out bits that look important and just writing a feature header. Now we are going back and putting in the details of all the tests. This is a great start and it's almost fun to see the graph of successful tests go slowly blue. (Jenkins seems to like blue, not green, for test results, but there is a plugin to change that if you want. There is a plugin for everything!)

The aim of creating these tests is obviously to give more confidence when we move over to the new server, but I also hope that we can move to a different way of managing configuration changes. Moving to a state where we write the test first and then the config should be a much more relaxed situation. There is also the ability to raise alerts into Nagios if we require it. So if we see some security tests fail we could raise a Nagios alert and quickly take action.
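
As a sketch of how the Nagios side could work, assuming passive checks via the external command file (the host, service and file path here are made up for illustration):

```python
import time

def passive_check_line(host, service, status, message):
    """Build a Nagios external-command line that submits a passive
    service check result (status: 0=OK, 1=WARNING, 2=CRITICAL)."""
    return "[%d] PROCESS_SERVICE_CHECK_RESULT;%s;%s;%d;%s" % (
        int(time.time()), host, service, status, message)

def raise_alert(line, cmd_file="/var/lib/nagios3/rw/nagios.cmd"):
    """Append the command to the Nagios command pipe.  The path is an
    assumption; check command_file in your nagios.cfg."""
    with open(cmd_file, "a") as f:
        f.write(line + "\n")
```

So a failing security test could call passive_check_line("www1", "security-tests", 2, "test failed") and push the result straight into Nagios.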

Writing tests has also forced us to go back through the old configs and ask some questions about what each section is trying to achieve. I have already removed one section that is no longer relevant and have earmarked a couple more for after the upgrade.

We are still adding tests to the features that we defined in the first run through, and that will give us a good base to be going on with, but I am sure that there are more we can add that are not directly related to the Apache config. Testing all the redirect rules is a massive job and we will need to automate it, at least for the initial set up. When we have something that is useful I would like to hand the management over to the web authors for the most part.
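
For the redirects, a table of source and expected target URLs plus a small loop would cover most of it (a sketch; the fetch function is passed in so it can run against a stub as easily as a live server):

```python
def check_redirects(rules, get_location):
    """Check each (source, expected) pair against the Location header
    returned by get_location(source); return the failures."""
    failures = []
    for source, expected in rules:
        actual = get_location(source)
        if actual != expected:
            failures.append((source, expected, actual))
    return failures

# Example with a stubbed server response instead of a live fetch:
stub = {"/old-page": "/new-page", "/about": "/info"}
bad = check_redirects(
    [("/old-page", "/new-page"), ("/about", "/wrong")],
    lambda url: stub.get(url),
)
# bad now holds the one rule that did not match
```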

Next up is to run puppet-lint over the puppet config from jenkins and shame us all into fixing our style, and then to start writing some puppet-cucumber tests! I already have it sending test results to our IM server for maximum annoyance.


I will go through some of the technical bits of the tools for those that are not familiar with them but there is better information on their respective websites.


Cucumber is a BDD (Behaviour Driven Development) tool that allows you to write feature definitions in English and then parse them to create real tests. You write out a feature and then some scenarios that would test the feature in a form of English, or your local language, called Gherkin. The resulting file looks a bit like this:

Feature: The web server should have a standard set of error pages

  Scenario: Requesting a non-existent page should give the code 404 and the default page
    When I visit /notarealpage.html
    Then I should get a response code 404
    And the page should contain the content "University of York"

The first word of each of those lines is a keyword. Scenario, When, Then, And are all used by Gherkin to know how to interpret the line. You then write some ruby code that can deal with each of the lines. So for the When line you may write.

When /I visit (.*)/ do |page|
  @url = "http://#{@host}/#{page}"
  @response = fetchpage(@url)
end

Then /I should get a response code (\d+)/ do |responsecode|
  @response.code.should == responsecode
end
Cucumber adds some extra keywords to work with Gherkin templates but basically it's just ruby. The first part builds a full url using the '@host' variable that we have set previously in the environment and the page that comes from the Gherkin line. The second part matches the response code check and uses RSpec's should feature to test it. I am assuming that we have already defined the 'fetchpage' function.

As you can probably see, now I can write easily readable tests in English for all my error pages without having to add any more code.

These are some really basic examples and I would advise you to go and read the site at http://cukes.info/ to get a real feel for the full features of cucumber.


Jenkins is a fork of Hudson. They are both Continuous Integration (CI) servers, separated by politics; Jenkins seems to be the most active and supported. CI is a workflow that means as code is checked into a repository, it is built and tested automatically and the reports from that code change are alerted on and stored. I am just using it for the nice web gui and its support for running tests regularly and interpreting the results. It has plugins for supporting all types of tests and builds, from Maven and ant scripts to test coverage, parser warnings and lint packages like jslint or puppet-lint. Again, go and read more at http://jenkins-ci.org/. Running cucumber with the command cucumber --format junit --out logs/ will create a load of junit format xml files that jenkins can understand and show you pretty graphs, and details of errors.
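
For a feel of what Jenkins is actually reading, here is a rough sketch of a junit-format result file being summarised with the Python standard library (the suite content is invented for illustration):

```python
import xml.etree.ElementTree as ET

# A made-up example of what cucumber's junit output roughly looks like.
SAMPLE = """\
<testsuite name="error-pages" tests="2" failures="1">
  <testcase classname="errors" name="404 gives the default page"/>
  <testcase classname="errors" name="500 gives the default page">
    <failure message="expected 500, got 200"/>
  </testcase>
</testsuite>
"""

def summarise(xml_text):
    """Return (suite name, total testcases, names of failed cases)."""
    suite = ET.fromstring(xml_text)
    cases = list(suite.iter("testcase"))
    failed = [c.get("name") for c in cases if c.find("failure") is not None]
    return suite.get("name"), len(cases), failed
```

Jenkins does all of this for you, of course; this just shows the structure it is graphing.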

Posted Fri Apr 13 14:33:49 2012 Tags:

This morning Squidward decided that he was going to move out from Bikini Bottom and away from all this madness, so he phoned the estate agent and she said it would be no problem to sell his house as long as his neighbours were okay. "Oh dear" thought Squidward. So he decided on a plan: to tell Spongebob that it was opposite day, so he was going to act crazy all day and Spongebob should play the straight man. It worked for a bit but as usual Spongebob had to take things too far and when the estate agent turned up hilarity ensued. Suffice to say that Squidward will never leave Bikini Bottom. I then left for the conference.

Last night was the conference dinner. Google had subsidised some of the meal, which was nice of them, and we were booked into the Cave, under South Bridge, with the cow's bottom sticking out of the wall. The meal was good and the talk by Charles Yarnold was great. I think he has one of those geek dream jobs, hacking crazy inventions for Gadget Geeks on Sky.

First up this morning was Simon Phipps talking about how the OSI is adjusting itself to the changing climate of opensource. They are moving from a board of 10 members to more of an open organisation with many members, currently taking in open projects like Plone, KDE, Mozilla, Apache, Creative Commons and Debian, but also hoping to accept personal and corporate memberships in the near future. They are also starting to act as a guiding voice for business and government, to try and skew the conversation back from the corporations that are trying to preserve their older business models at all costs.

http://openrightsgroup.org/ - They are one of the few voices that are speaking to government in defence of public rights. If you think their language is too strong then you can only imagine what is being said behind closed doors on the other side of the argument.

Tariq Rashid gave a great talk about all the work that is being done inside government to level the playing field when it comes to opensource and standards inside government. They are laying down the law, quite literally in some cases, about how open software should be considered for all purchases. Whilst his language was obviously designed to appeal to political and corporate interests I can't help feel that the constant focus on the free as in beer angle is not all that needs to be stressed. I think he has a massive job though so whatever works I suppose.

A quick break and some more fluids and it was back in to a talk on OpenAFS. I always knew it was cool and even had a go at setting it up once but I think it's a large investment. Interesting to see that he said that the move to git for version control has really upped the involvement levels and what is quite an old distributed filesystem is looking very suitable for modern requirements. They are adding mobile clients and a web interface is working right now.

A small violin performance before a talk on ldap performance from Howard Chu. Always impressed with people who can actually play, and in front of an audience. The improvements to openLDAP made by ditching all their caching layers and just going for a memory-mapped storage database were impressive, not only for the massive improvements in speed but also for removing a lot of the need for tuning. Reminded me of some of the stuff I was reading about Varnish cache. Looks like that is the LDAP server to look at. He also said that he ported the database backend to sqlite among other things and it made a huge improvement there too.

After lunch and some time in the sun it was a talk about Rudder, which is another config management tool, but this time it's all about simplicity of use. There are still configs that you apply to nodes, but this config can be done in a nice web front end and the software builds CFEngine templates that are automatically pushed to the correct clients. It is quite a complete system, with packages for the clients and logging and monitoring all built in.

Another configuration talk now, and I must admit I was flagging a bit by this point, which was a shame because this was quite an interesting concept. We currently provision servers by stating a configuration that is supposed to be running on them, but with no concept of any sort of state. I can change a server to turn off a service on one side and then bring it up somewhere else, but if I want to manage the loadbalancer to make sure that the service is never interrupted then I have to do that manually. Herry Herry's concept was to add some rules that allow you to say things like: don't turn off that service until the other one is up and the loadbalancer has switched over. He was concentrating on creating a language to encapsulate your rules and a compiler to create a set of steps required to make things happen in the right order. It is just a test tool right now but I can see it being a great addition to puppet.

Then the Lightning talks, just 5 mins so just one line...

Tech Talk PSE - HTML based presentation tool, that can include shell.

pairvm - drbd-based data centre failover for kvm. (Might want to look at ganeti.)

time travel for linux - SystemTap script to lie to the application about the date being 2038, to test what happens at the end of a pension.

cisco firewall issue - SACK packet ids rewritten by firewall - tcpdump rewrites the numbers again.

Machination app store for machine config: users can select apps for their servers.

postgresql update - power management: the DB sleeps when idle and performance is much faster. 9.2 in September.

virtualisation considered harmful - we don't need VMs, just run Unix processes. Sort of an anti-talk about everything being said!

Samba 4 update - Full AD support is there now. CIFS clustering. WAN proxy. Preview for release in April.

Your judgment sucks - Humans are buggy as well.

That's it. I am off to the doctor.

Posted Thu Mar 22 17:53:02 2012 Tags:

First day at #flossuk Spring Conference and there is a lot of talk on configuration management, logs, monitoring and devops.

Matthew Bloch of ByteMark gave a great insight into their BigV virtualisation with lots of interesting details. I think the most surprising thing was that they were using nbd for all their disks. A really simple way to get cheap VM migration, and so far really reliable.

Kris Buytaert's 7 Tools of DevOps talk gave all the usual CI, monitoring and config management tools, but was also keen to stress how important the whole talking and interacting part is. DevOps is not about one tool set but more about making sure the communication happens. A lot of the DevOps tools are there to help start these conversations. Being able to talk to people about graphs and changes in tangible ways is really useful.

I heard some good reports about the asciidoc talk. Anything that makes documentation easier has to be good. My current thoughts would be to move more of our documentation into the puppet config as Markdown files and api docs. Then run a tool against the repo on check-in to create a web tree.

The talk by Bernd Erk about Icinga, a fork of Nagios, was quite interesting. Nice to see people developing Nagios. They have done some crazy things like using LDAP to populate the config and adding graphing to the monitors. I think that personally it would make more sense to move this sort of thing into our puppet config, but maybe having a node assigned with services in an LDAP tree could be used to provision it with puppet and then also test that service with nagios. As for the graphing, I think that feeding those checks out into Graphite is still my preferred option. And I think Patrick Debois has some interesting suggestions for linking tools like graphite, collectd, nagios and Ganglia together.

Some other small themes that seem to keep cropping up. Ruby is what the cool kids are using. Java is not getting much cheerleading but is used a lot. Chef and puppet seem to be being used or considered by a lot of people.


Posted Wed Mar 21 17:44:30 2012 Tags:

Post interval: graph

Posts per month this year: graph
