Grampian Information Conference 2014 – Buzzword Bingo

On Thursday 6th November I’ll be giving a talk at the Grampian Information Conference.

During my talk we’ll be playing a game of Buzzword Bingo.

Below are two bingo cards which you can download and print.

Please pick one, print it and take it with you.

If you complete a line, shout out ‘House’ – if you’re first you’ll win a small prize.

A Scotland-wide FOI search facility?

Updated (*) 17/10/14: two new bullet points added at the end.

At the first Code The City event, held in Aberdeen in June 2014, one of the projects which was worked on for the weekend was the setting up of a search facility for Scottish Local Authorities’ FOI disclosure logs.

People at desks

Iain Learmonth, Johnny McKenzie and Tom Jones created a scraper for the disclosure logs of both Aberdeen and East Lothian councils. This scraped the titles and reference numbers of FOI responses and, if I recall correctly, processed the data as Linked Data at the back end before putting it into a MySQL database. They then put a web search front end onto it. A demo site, FOIAWiki, was set up for a while but was then taken down.

Iain has now made his code available on Github.
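
For anyone curious about the general shape of such a scraper, here is a minimal sketch, assuming a council publishes its log as a simple HTML table: fetch the page, pull out reference numbers and titles, and store them (in SQLite here, for simplicity – the original used MySQL). The URL and selector are hypothetical placeholders, as every council’s log is laid out differently, and this is not Iain’s actual code – that is on GitHub.

    # A minimal sketch of a disclosure-log scraper. The URL and CSS
    # selector below are hypothetical placeholders.
    import sqlite3

    import requests
    from bs4 import BeautifulSoup

    LOG_URL = "https://example-council.gov.uk/foi/disclosure-log"  # placeholder

    def scrape_log(url):
        """Return (reference, title) pairs from a disclosure-log page."""
        html = requests.get(url, timeout=30).text
        soup = BeautifulSoup(html, "html.parser")
        rows = []
        for row in soup.select("table.disclosure-log tr"):  # hypothetical selector
            cells = row.find_all("td")
            if len(cells) >= 2:
                rows.append((cells[0].get_text(strip=True),
                             cells[1].get_text(strip=True)))
        return rows

    def store(rows, db_path="foi.db"):
        """Store the scraped rows in a local SQLite database."""
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS responses "
                     "(reference TEXT PRIMARY KEY, title TEXT)")
        conn.executemany("INSERT OR REPLACE INTO responses VALUES (?, ?)", rows)
        conn.commit()
        conn.close()

    if __name__ == "__main__":
        store(scrape_log(LOG_URL))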

That wasn’t all we wanted to do – so I hope that at the second Code The City event, which takes place in Aberdeen this weekend (18-19 Oct), we can revisit this and go a bit further.

One way we could do this is to broaden out the number of councils that we scrape; the second would be to go deeper, scraping the content of the actual PDFs which contain the detailed responses themselves, and making that searchable too.

There are some hurdles to be overcome – and I’ve been working on those over the last week.

The first is that only a small proportion of Scottish councils actually publish a disclosure log. According to some research I carried out last weekend, only eight councils (25% of the total) do so. Amongst those, only two of Scotland’s seven cities actually had discoverable disclosure logs. I’ve posted a list of them here. And, of course, the quality of what they provide – and the formats in which they present their data – vary significantly.

The second issue is that about half of those with disclosure logs post their responses as PDFs. These files contain scanned images, so the text can’t easily be extracted. However, there are tools which, used in combination, can yield good results.

In preparation for the next Code The City session I spent two days and a couple of evenings experimenting with different options. Keeping with Iain and Tom’s original approach, and bearing in mind my own slender coding abilities, I looked for Python-based solutions.

After several abortive attempts I settled on this solution. It uses Tesseract and PyPDF (the original site for which appears to have disappeared – but this one works).

In setting up my environment I had several difficulties getting Tesseract and its dependencies installed. I’d missed that, on Ubuntu-based systems, you could use the command:

"sudo apt-get install tesseract-ocr"

(I’d been trying “sudo apt-get install tesseract” – i.e. without the -ocr – and it had failed.)

Instead, I’d tried downloading the source and compiling that according to these instructions.
(HINT – avoid this option at all costs if there is any possibility of using a package manager to do the job.) The upshot was that running multipage-ocr.py failed every time because Tesseract and its many dependencies were not installed properly.

Trying to fix it by installing the pre-packaged option on top, then removing it and reinstalling via the apt-get package manager, failed several times; it took hours to get rid of the manual install and start again with a clean installation. As soon as I did, the code worked.

So, running it from the command line thus:

python multipage-ocr.py -i abcde.pdf -o output.txt

generated a high-quality txt file containing the OCRed text of the PDF. You can see the input file here and the output file here. As you will see, the quality of the output is very good.
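
For anyone who wants a feel for what a pipeline like this does under the hood, here is a minimal sketch of the same idea – render each PDF page to an image, then hand it to Tesseract – driven from Python via the pdftoppm and tesseract command-line tools. This is an illustration only, not the multipage-ocr.py script itself, and it assumes the tesseract-ocr and poppler-utils packages are installed.

    # Illustrative OCR pipeline: PDF pages -> PNG images -> Tesseract text.
    import glob
    import subprocess
    import tempfile
    from pathlib import Path

    def ocr_pdf(pdf_path, out_txt, dpi=300):
        """OCR a scanned PDF and write the recognised text to out_txt."""
        with tempfile.TemporaryDirectory() as tmp:
            # 1. Render every page of the PDF to a PNG image.
            subprocess.check_call(["pdftoppm", "-png", "-r", str(dpi),
                                   pdf_path, str(Path(tmp) / "page")])
            # 2. Run Tesseract on each page image and collect the text.
            pages = []
            for png in sorted(glob.glob(str(Path(tmp) / "page*.png"))):
                base = png[:-len(".png")]  # Tesseract appends .txt to this name
                subprocess.check_call(["tesseract", png, base])
                pages.append(Path(base + ".txt").read_text())
        Path(out_txt).write_text("\n".join(pages))

    if __name__ == "__main__":
        ocr_pdf("abcde.pdf", "output.txt")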

Notebook and laptop

Notes for the weekend

So, for the weekend ahead, here are some things I would like for us to look at:

  • amending the scraping to work over all eight sites if we can – and looking at how this affects the original information request ontology from weekend one.
  • grabbing the PDFs and storing them in a temporary location for conversion. The process is quite slow to run and processing a 2 or 3 page document can take some time.
  • running the script not from the command line but via os.system() or similar, passing in the parameters (see the sketch after this list).
  • limiting the OCR to work on only the first two or three pages (max). Note: I saw one Aberdeen City Council PDF which ran to 120 pages! Beyond the first couple of pages the content tends to be of less relevance to the original query – and mostly forms background material, as far as I can see.
  • converting the multipage-ocr script to not (just) output to a txt file, but to grab the text as a bag of words, remove stop words, and store the output in a searchable DB field – associated with the original web pages / links.
  • setting up some hosting with the database, the Python scripts and their dependencies, and a web front end onto it all.
  • Further options would be to look for standard structure in the documents and use that to our advantage – and to set up an API onto the gathered data.
  • (* update) – we need to track which FOI responses have already been indexed and only schedule indexing of new ones, and
  • we could use WhatDoTheyKnow RSS feeds for missing councils, e.g. https://www.whatdotheyknow.com/body/glasgow_city_council (although this would not be a full set).
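
As a starting point for the scripting and storage bullets above, here is a minimal sketch: it calls multipage-ocr.py via subprocess (rather than os.system), reduces the OCRed text to a de-duplicated bag of words with stop words removed, and stores the result against the source URL in SQLite. The stop-word list and table layout are illustrative assumptions, not part of the original project.

    # Sketch: OCR a response PDF, boil it down to searchable words, store it.
    import re
    import sqlite3
    import subprocess
    import tempfile
    from pathlib import Path

    # A small illustrative stop-word list -- a real one would be much longer.
    STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for",
                  "on", "is", "are", "this", "that", "with", "be", "as"}

    def ocr_to_text(pdf_path):
        """Run multipage-ocr.py on a PDF and return its text output."""
        with tempfile.NamedTemporaryFile(suffix=".txt") as out:
            subprocess.check_call(["python", "multipage-ocr.py",
                                   "-i", pdf_path, "-o", out.name])
            return Path(out.name).read_text()

    def bag_of_words(text):
        """Lower-case and tokenise the text, dropping stop words and duplicates."""
        words = re.findall(r"[a-z]+", text.lower())
        return sorted(set(words) - STOP_WORDS)

    def index_response(url, pdf_path, db_path="foi.db"):
        """Store the searchable word list for one FOI response against its URL."""
        words = bag_of_words(ocr_to_text(pdf_path))
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS foi_text "
                     "(url TEXT PRIMARY KEY, words TEXT)")
        conn.execute("INSERT OR REPLACE INTO foi_text VALUES (?, ?)",
                     (url, " ".join(words)))
        conn.commit()
        conn.close()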

And probably a whole pile of stuff that I’ve not yet thought of.

I’m looking forward to seeing how this comes along!

Ian

Scraping Twitter data with Morph.io

At work we operate a number of Twitter accounts, and we have been monitoring the number of followers each has, on a monthly basis, for a number of years.

The collection of the data – which we normally put into a spreadsheet – has until now been done manually, which is, to be frank, a pain.

So, I decided a while ago to set up a scraper for it. I’ve only now got around to doing it – and it has taken me longer than I thought it would.

You can see it here https://morph.io/watty62/Count_twitter_followers although I might clone it from my personal account to a work one. The beauty of using Morph.io is not just that it does the scraping for you, but that you can also pull the data out using an API.

Morph runs using files on GitHub, so the scraper.py file is in my GitHub account. I’ve added comments throughout the code, so it should be self-explanatory.

There you will also find some other files, including the ReadMe.md file which I’ve put together, and the followers.csv file, which contains an export of the data from our original spreadsheet.

The ReadMe.md explains how the code works and how it writes to Morph.io’s SQLite database. In putting together this project I’ve become quite a fan of SQLite and its elegant simplicity (even if I had to switch to the Scraperwiki version for Morph).
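
For anyone curious, a Morph.io scraper boils down to very little code. Here is a minimal sketch, assuming the scraperwiki library that Morph provides; the account handle and the way the follower count is pulled out of the profile page are hypothetical illustrations (Twitter’s markup changes often), and the real scraper.py in the repository is the authoritative version.

    # Sketch of a Morph.io-style follower-count scraper.
    from datetime import date

    import requests
    import scraperwiki
    from bs4 import BeautifulSoup

    ACCOUNTS = ["example_account"]  # hypothetical list of handles to monitor

    for handle in ACCOUNTS:
        html = requests.get("https://twitter.com/" + handle).text
        soup = BeautifulSoup(html, "html.parser")
        # Hypothetical selector for the element carrying the follower count.
        node = soup.select_one("[data-nav='followers'] .ProfileNav-value")
        followers = int(node.get_text().replace(",", "")) if node else None
        # Morph.io exposes whatever is saved to its SQLite database via an API.
        scraperwiki.sqlite.save(
            unique_keys=["handle", "month"],
            data={"handle": handle,
                  "month": date.today().strftime("%Y-%m"),
                  "followers": followers})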

And while the followers.csv file is not required by the code or Morph, it is there so you can see the data that I import in the first section of the scraper.

The scraper is set to run live for the first time on Monday 1st Sept, and monthly thereafter. I’ve tested it as much as possible but will watch nervously on Monday to see if it works as planned!

Hopefully you will find a use for a modified version of the scraper too!

When is a webteam not a webteam? And where do you put it?

The team which I manage for our organisation has a very mixed portfolio, but it all hangs together well.

We have smart cards, Geographic Information Services, the Corporate Gazetteer, various aspects of operating and hosting a portfolio of web sites, domain name management, a lead on Social Media, Open Data, user-centred service design – all of which comes under the broad category of Digital Services (although we don’t currently call ourselves that). We still use the now rather dated term: e-Government.

And we operate the biggest Customer Service channel.

We haven’t had a web team, as such, since 2006.

Our move from web team to being the providers of a broader suite of services began in 2003 when I put forward a business case to integrate the management of web content (then under Comms) with the running of the technical side, which was then my role. And when that happened we were one of the first organisations of our kind to implement that approach, certainly in Scotland. Many have since gone down the same route.

That integration brought huge dividends. We had one strategy, one approach, one set of objectives and we pooled resources rather than spend our time negotiating with other teams (or worse).

And we built on that core team – bringing in other services, becoming e-Government, moving from what had been the provision of an informational site (with the content run by Comms) to focus on initial online service delivery projects – with our first big launches in Summer 2004.

Prior to that no-one in the Senior Management Team seemed to know what to do with us. Up to 2003, if memory serves me right, our team had been based in ICT (with half in Comms), HR, the Office of The Chief Executive, Planning, Property, and back again to either HR or OCE – and we were later to find ourselves in Continuous Improvement and Corporate Governance. So we’ve been about a bit!

Since 2004, we’ve increased our focus on online service delivery – although that took a bit of a dip in 2006 when we landed with a manager who just didn’t ‘get’ web or self-service. And our staffing got cut at that point and has never recovered.

When we restructured again in 2010 we were aligned with Customer Services – as we had sought to be – and we work closely with that team to address all three service delivery channels. That works really well and means that not only do we have an excellent relationship (with the teams co-located in the same physical space) but we also align strategy to a common aim.

Now we find ourselves caught up in another restructure. One which will see a new Head of Communications role created and filled. This is a welcome move and one which is badly needed.

As part of the proposed restructure it was suggested that Web Content move to that role. We responded to explain more about what our team does; how working with the Head of Comms role to provide a communications platform for a comms team might work; and how certain aspects of our team’s role, such as a lead on Social Media use, might transfer to that new function (noting that our current work on Social Media is focussed on integrating it into the Customer Service / Contact Centre in the coming weeks). We also highlighted the wide range of functions we perform, how we work with other teams, and explained that there is no longer a stand-alone web content function (as that is at the service of fulfilling customer service), and so on.

Now the spectre of moving the whole team to Communications has materialised and has caused a degree of surprise. I think it is fair to say that this would be, by most measures, a retrograde step, and one which would put us at odds with just about every other local authority in the UK, some of whom have been seeking to emulate our long-standing integrated function, and also to align all three delivery channels under a single management team as we have. It would take us back to pre-2010 in terms of our joined-up approach to Customer Service – and in fact to pre-2006 structurally.

We await further discussion on this – and the chance to explore the risks and opportunities such a move might bring. I’ve certainly been doing a lot of soul-searching on this. Perhaps I am wrong. Maybe we should move away from having a website that is the primary service delivery channel. How that would support an approach to Customer Service Excellence – which we’re tasked with delivering – I can’t quite work out yet!

Meantime, if you work in digital services, web, or customer service in Local Government, I would welcome your comments below, please!

What next for CodeTheCity?

We kicked off CodeTheCity with our first Aberdeen event on 21st – 22nd June 2014 which really couldn’t have been better.

We had some 38 volunteers – service users, service owners, coders, designers, bloggers – all of whom turned up and gave around 700 hours of their time over the weekend to eight civic hacking projects. At least three of these are sufficiently well developed that they can be taken forward as live ‘things’ – and the others have at least sown the seeds of further challenges and may go further.

CodeTheCity Logo

You can find much more here:




But where do we go next and what do we do?

Well, I’ve had a chat with Steve Milne. He’s not just one of the four main collaborators behind this endeavour, he’s also the driving force behind the name, the concept, the site and much more.

He and I are pretty much agreed on most of how we take this forward. So here are my thoughts.

The concept of CodeTheCity is not limited to Aberdeen. We’ve already had interest from Edinburgh and Helsinki – and it is generic enough to work anywhere. To that end, Steve has drafted a manifesto and a guide to running a CodeTheCity weekend. He’d like your feedback on those.

While I’m keen to see what happens further afield – and how we can support and guide that where necessary – I’m also determined to ensure that we keep it working in Aberdeen, using our first outing as a springboard for future activities, events, and engagement between the different communities, organisations, and individuals who supported us.

The drive still needs to be bottom-up (rather than led by official organisations), with service users, and some service owners, bringing us challenges and problems to work on. That said, we do need, and recognise, the support of the local authority, academia, SMEs, and larger international organisations, all of whom instantly recognised that there is a model here which is worth supporting.

Working with local groups

We had representatives of some Aberdeen community centres there at the June 2014 event, and some sports staff, some council employees from the translation service and so on.

Next time we need broader participation – and having had conversations with each of those groups in the last week, I’m confident that their enthusiasm for the format and approach and keenness to keep it going will pull others in.

There are other models that we can look at and learn from – such as Apps For Amsterdam and Appsterdam, which have broad community bases.

We also need to work with my employers – Aberdeen City Council (ACC) – and look at some fundamentals, such as how data is gathered, stored and made available for re-use. Doing this right would not only generate much more open data, allowing greater sharing and re-use, but would also benefit end users, and make it easier for CodeTheCity participants by avoiding some of the issues we had around scraping sport timetable data or FOI data from closed systems.

We want open data to be at the heart of most of what we do. This presents opportunities to decentralise some of the maintenance of data (eg for Community Contacts) and, rather than relying solely on Library staff in this instance, look at how some of that could be looked after by the groups whose data it is.

This ties in neatly to Aberdeen City Council’s participation in Code For Europe (C4E), which I’m leading, and on which Andrew Sage is our Code Fellow. Part of what we’re doing locally with CodeTheCity is providing small-scale test beds for projects that could be scaled up or broadened out with C4E support.

Both have at their heart the need for data – clean, open, maintained, accurate, timely – and also for user involvement in developing web applications and associated services.

Next steps

So, in summary, as I see it at the moment, the future for CodeTheCity in Aberdeen is rosy. The energy at the June event, and the number of subsequent conversations, expressions of commitment, and a common desire to keep doing more, is infinitely greater than I’ve experienced in association with any previous hack weekend.

So we can expect:

  • more events – at least one big-scale one in Autumn 2014, with potential smaller informal or linked ones meantime,
  • support from ACC’s CodeForEurope programme in further developing some of the infrastructure requirements (eg for an Open Data repository, use of Open311, CitySDK) and sourcing and making available many new datasets,
  • a growing awareness and buy-in from ACC senior managers that the CodeTheCity approach is one that works – by getting service users, providers, coders and others together to identify, workshop and design new solutions.

I, for one, am excited and energised by how positive the future of CodeTheCity looks.

If you attended the June event and enjoyed it – tell your friends. If you missed it but like the sound of it, get in touch. Get on our mailing list and we’ll make sure that you know when we announce the next activities.

Ian

MatchTheCity Project at CodeTheCity

A quick update from Code The City.

I’m working with the #MatchTheCity project team. Our aim is to provide some of the data and infrastructure to support other projects – such as #BigSociety.

I’ve been working with Dave Morrison, the Code Fellow for East Lothian who came up for the weekend, on liberating some seriously closed data and making it open.

Meantime Andrew Sage, Aberdeen City’s Code Fellow, has been working on processing the data and making a data model to hold it all – and offer it up online for the other projects.

 

Dave and I have scraped and transcribed:

  • The current Sport Aberdeen timetable, which is a very odd piece of work: not open, it has to be forced to show all days’ events, uses JavaScript to reveal rows, and uses JPEGs rather than text to denote the day of the week. Dave did a great job of scraping this.
  • The Aquatics Centre timetable. This is a PDF – so not open, it needs a ruler and a print-off to work out session start and end times, couldn’t be read by a screen reader, and isn’t even scrapable, so it had to be hand transcribed. Also, service disruptions are noted on the HTML page above, separate from the PDF.
  • The Exercise Classes at Aberdeen Sports Village. This was easier to scrape semi-automatically, but could be much better.

At present, there is no way to search across these – eg if you just want to find out where you can lane-swim tonight, or establish what family sports opportunities are available next Saturday.

We also created a json file of venues’ details.
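
To give a flavour of the kind of cross-venue search we’re after, here is a minimal sketch that loads scraped session data alongside the venues JSON file and filters it for lane swimming today. The field names (activity, venue, day, start) are illustrative assumptions about the data model, not the actual MatchTheCity schema.

    # Sketch: find today's lane-swimming sessions across all scraped venues.
    import json
    from datetime import datetime

    def lane_swimming_today(sessions_path="sessions.json",
                            venues_path="venues.json"):
        """Return today's lane-swimming sessions with venue details attached."""
        with open(sessions_path) as f:
            sessions = json.load(f)
        with open(venues_path) as f:
            venues = {v["name"]: v for v in json.load(f)}
        today = datetime.now().strftime("%A")  # e.g. "Saturday"
        return [dict(s, venue_details=venues.get(s["venue"], {}))
                for s in sessions
                if s["activity"].lower() == "lane swimming" and s["day"] == today]

    if __name__ == "__main__":
        for session in lane_swimming_today():
            print(session["venue"], session["start"])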

Andrew then created this on Heroku. And everything is on GitHub here.

The scraping we’ve done will provide a static snapshot. We need to work with Aberdeen City Council’s Education Culture and Sport directorate to work out how we get open data sources to be set up and maintained for these, so we can offer back the data for integration into the website.

Iain Learmonth has pulled most of these data into Linked Data format (all in 20 minutes).

Dave is now working on mapping this. Almost done.

Rowan Evenstar helped with some styling of the page.

Show and tell at 3pm.

Update: the styling and mapping won’t work on the Heroku-hosted site, while both work on local copies. This is no biggie. The Heroku site was created just to visualise what we’d got into the back-end system, and would not be a long-term solution for viewing it anyway.

Oh, and #MatchTheCity won the popular vote after the show and tell, which was nice!

Ian

 

Scraping Tools – A quick round-up

I’ve written here before about using Scraperwiki to scrape content from websites which haven’t implemented OpenData.

I have even used Scraperwiki to scrape our own website to get badly-formed content out in a structured way for a hack.

Now, sadly, Scraperwiki is no longer free, and old scrapers are mostly frozen. So if you are about to embark on a hackathon you might be looking for an alternative.

I’ve noticed a few recently – but have yet to test them.

  • Morph: a web-based tool from OpenAustralia. Write scrapers in Ruby, Python or PHP.
  • Portia: released yesterday(!) – available via GitHub, and soon to be made available as a hosted service.
  • Import.IO: a free web-hosted service that promises further (maybe paid-for) features. It looks like it has a great Help section, with support, webinars etc.

Have you used any of these? Leave a comment with your experiences. Or let me know of other alternatives!

Ian

Code For Europe (Scotland)

As I wrote in my previous post, Aberdeen City Council have joined Code for Europe 2014.

Code For Europe Badge

Following the international meeting in Barcelona, and ahead of the appointment of the code fellows, the four participating Scottish councils came together in Edinburgh this week to compare where we are and what our next steps will be.

As part of that meeting Katalin Gallyas of Amsterdam city municipality addressed the group on that city’s experiences. She spoke of the development of the apps mentioned in my previous post, but also covered a number of other issues. These included tools they use, such as Open Data Kit, and the choice of Open Data platform: “Beware the big corporates offering expensive, proprietary platforms”. Amsterdam chose CKAN, an open-source one. This is a choice that the Scottish authorities also need to make – and perhaps an opportunity for something on which we could collaborate.

She also covered the extensive and enviable Open Data ecosystem that exists in Amsterdam. This involved

  • SMEs, hackers, coders,
  • Business accelerators
  • Innovation intermediaries
  • EU projects and funding
  • An open-minded city government
  • Participatory citizens

In 2014 Amsterdam are seeking national and EU funding – and they have committed to having four full-time employees who, amongst other duties, will be collecting evidence and use-cases on reusable apps.

Badge with the slogan I Amsterdam

All of this activity and engagement is bringing the state of the art into city hall, shifting from traditional in-house one-off developments (of the kind we see all over the place) to the use of tools such as Tableau or GitHub, allied to the adoption of, and growing familiarity with, concepts such as commons and standards, civic coders, hackers, and grassroots tech.

She suggested that in following such a model Scotland needs to be looking for sources of external funding – such as Horizon 2020, but also at growing entrepreneurship, and developing a local Open Data ecosystem.

We also need to look at Apps For Amsterdam, and Appsterdam, which has loads of activities and even its own ‘Mayor’, Mike Lee.

All of this activity acts as a real Open Data catalyst. It exposes the need for a vocabulary match between the policy makers and the civic app space, generates clear use cases for the data. And it highlights the need for the adoption of commons and opens standards by local authorities.

Suzanne leads a workshop

Katalin’s presentation was followed by some highly energetic road-mapping of the Scottish programme, led by Suzanne of the Waag Society, of whom I wrote in my previous post. This generated some positive Twitter comments.

I’ll return to this topic in future posts as the Scottish C4E programme gets underway.

Code for Europe comes to Aberdeen via Barcelona

Recently Aberdeen City Council has accepted an invitation (via NESTA)  to join the Code for Europe movement. This post sets out some of the background to that. It covers the content and outcomes of the first meeting of code fellows in Barcelona at the end of February 2014.

ESADE Buildings

ESADE Buildings

A following post will cover a subsequent meeting of this year’s Scottish Local Authority participants which took place a week later.

What is Code for Europe?

Code for Europe (C4E) “strives to solve local civic challenges, by enabling agile temporary teams of developers to create solutions that are easily reusable in other European cities”. It is based on the original Code for America model, and sees Code Fellows, who retain their independence (rather than being contractors), being embedded in City Hall. This means that they work alongside the people who know the problem or have access to the data needed to power the solution. By that process it aims to bring about positive culture change in city hall.
The C4E movement was established at the start of 2013, and involved six cities: Barcelona, Helsinki, Amsterdam, Berlin, Rome and Manchester. The aim was to deliver web or mobile apps which are taken up by the local authority – not left to wither as they might do after a hackathon, for example.

2014, the second year, brings new entrants

NESTA have previously sponsored the Make It Local programme in Scotland which delivered four open data-driven sites including Edinburgh Outdoors and SmartJourney.
Now NESTA have created a programme which will see four Scottish Local Authorities join the C4E movement: Aberdeen, Edinburgh, East Lothian, and Clackmannanshire. As part of this programme, there will be a tiered mentoring scheme in place. Helsinki, who bring their experience of C4E 2013, will mentor Aberdeen. And Aberdeen, who participated in Make It Local in 2013, will in turn mentor Clackmannanshire. In the same way, Amsterdam will mentor Edinburgh, who will mentor East Lothian.
Additionally, all cities who participated in C4E 2013 are remaining for the current year. Most are refreshing their roster of fellows – or augmenting them.

The Barcelona Get-together Feb 2014

Student Accommodation at ESADE

On the last two days of February 2014, the first international meeting of code fellows for participating cities took place at the ESADE Business School outside Barcelona. Code fellows and representatives of various participating bodies attended. These included Helsinki, Aberdeen, Berlin, Amsterdam, Barcelona, East Lothian, along with attendees from NESTA (UK and Scotland), EU Commons, ESADE business school, and independent developers.

The two days allowed for presentations, panel discussions, workshops and simulated app development. It allowed code fellows who are already in place for this year (not all are) to start to network.

The eGarage Space at ESADE

The Presentations

  • We heard about the development of Samensapp (which, like all C4E projects, is registered on the Europe Commons site with its code on GitHub). This was created after an area in Amsterdam was identified as having significant social problems. The Code Fellow worked not only with the local government there but also with local citizens, first to identify the problems, then to look at how technology could address a specific societal need – in this case allowing people with common interests to band together and book underused rooms in community centres which traditionally could not have been booked online. Some community members were taught how to maintain and improve the code.
  • We learned about Take A Hike, which was developed after it was identified that tourists visit only 4% of Amsterdam, and that that small area is congested for locals and visitors alike. So this app was developed to create trails and to incentivise tourists to explore further afield. Like all C4E developments, the source code is open and the system could easily be re-purposed for any other locality.
  • We heard from Lluis Esquerda, a civic hacker, about his frustrating journey to liberate data about the availability of shared bikes in most major cities of the world. This involved cease-and-desist letters, and inconsistencies and frustrations from City Halls all over the EU and beyond. His experiences were all the more powerful for being presented unvarnished!
  • Timo Tuominen of Helsinki and Jan-Christoper Pien of Berlin are both new code fellows. They spoke of their prior experiences and also of the process they are going through to identify potential projects to work on in each of their host cities.
  • Sergio Diaz from Barcelona described a project he is working on, an offshoot of the local Bottom Up Broadband programme. This will see citizens building and hosting Arduino-based environmental monitoring stations which will capture data locally but make it available as open data, forming a city-wide grid of sensors – a Smart Citizens platform / toolkit.
  • Haidee Bell and Paul Mackay, both of NESTA in London spoke of a number of things:
  1. The history of Europe Commons which presents a searchable database of city apps created as part of C4E and other initiatives. The future version of that platform will see it list more details of each project: what the problem was, what the solution was, which data sets are underpinning it, and where it has been re-deployed, and so show how the initial investment has been capitalised on.
  2. Nuams, Open Civics, API Commons (a place to publish your API specs),
  3. the City Software Development Kit (SDK),
  4. Open 311 and
  5. the emerging General Transit Feed Specification (GTFS).

In this short segment there were some real nuggets that I want to check out, discuss with developers and hackers, and look at how we can support and contribute to these in Aberdeen, Scotland and the rest of the UK’s local government space.

  • Ruth Watson of NESTA in Scotland, who had arranged the whole trip for the Scottish contingent, opened a session on the Make It Local programme, and I presented on both SmartJourney and Edinburgh Outdoors (all mentioned above).
  • We were given more in-depth information on the history of Code For Europe and its origins with backing from the World Bank. This was presented by Prof Esteve Almirall of the ESADE Business School, who was instrumental in setting up the C4E programme. He also outlined possible future developments, including the potential of a Global Commons built on the Europe Commons model, which could also include a translation facility. None of this will happen, however, until the conclusion of an ongoing World Bank restructure.

    Prof Esteve Almirall in the ESADE Lecture Theatre

Designing a game with post-its and cards

Suzanne Heerschop of the Waag Society (backers of C4E in the Netherlands, as NESTA are in the UK) led the final workshop sessions, designed to get fellows and city representatives used to working in partnership on projects. In this we used prompts from AddingPlay cards from PlayGen to steer development and make us think of challenges that may arise. This was a very fun and creative session. I must get myself a set of those cards.

All in all the two days sped past – with good company, engaged participation and stimulating ideas. And it set the Scottish contingent up for the Edinburgh session which was set for a week later and which will be the subject of my next blog post.

Team Strike demo their game

Top non-fiction reads of 2013

Four of my five top non-fiction books which I read this year are Python programming titles. This is unsurprising as I’ve spent a considerable amount of time teaching myself Python via books and MOOCs.

The titles below will take you from absolute beginner to having a firm grasp of how to build interactive, useful, modern applications with this versatile programming language.

The fifth title is a non-programming book, which I reviewed back in June. While it is not about programming, it does touch on Open Data, data journalism and the need for new skills among citizens and journalists alike.



  • Learning Python, by Mark Lutz
  • Python Cookbook, by David Beazley and Brian K. Jones
  • Facts are Sacred, by Simon Rogers