As a fairly routine part of my job I end up trying to get different sources of data to play nice. For example, today I’m assembling a set of data for a bunch of councils coming to a leadership academy. I have to take appeals data (from PINS), NI157 data (from DCLG) and some other stuff I’m generating.
Life would be simple if everyone started with a GSS code. These are the standard codes that identify every council. You can then use a VLOOKUP or a little database to get your data neatly lined up. If only. PINS say “City of York” and DCLG say “York City Council”.
It’s even worse when you ask a human which council they work for – as we do when people book up for our events. So here, in case it is useful for anyone in a similar position to me, is the list I’ve built over the years. It’s a long list of councils, spelled in many and various ways, mapped to the GSS codes. It works about 80-90% of the time, and when it doesn’t I add the new variety of spelling at the bottom.
It’s a simple job to see which council has the biggest variety of spellings. Step forward the “Borough Council of King’s Lynn & West Norfolk” (according to their website). You are listed 12 times, a few more than any other:
Borough Council of Kings Lynn & West Norfolk
Borough Council of King’s Lynn & West Norfolk
Borough Council of King’s Lynn and West Norfolk
Borough of King’s Lynn & West Norfolk
King’s Lynn & West Norfolk Borough Council
King’s Lynn & W. Norfolk
Kings Lynn & West Norfolk
Kings Lynn & West Norfolk BC
Kings Lynn & West Norfolk Borough Council
Kings Lynn and West Norfolk
King’s Lynn and West Norfolk
King’s Lynn and West Norfolk Borough Council
So, as a small geeky gift to the world here is an English council fuzzy match thing. Enjoy. Authority names and GSS codes etc 1.6
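If you'd rather not do the lookup by hand, the same idea can be sketched in a few lines of Python. This is a minimal illustration, not the actual contents of the spreadsheet: the normalisation rules are my own guess at what matters (strip punctuation, expand "&", drop boilerplate words, sort the rest), and only two councils are shown.

```python
import difflib
import re

# Illustrative fragment of the lookup table: normalised name -> GSS code.
# The real list maps hundreds of observed spellings.
GSS_LOOKUP = {
    "kings lynn norfolk west": "E07000146",  # King's Lynn and West Norfolk
    "york": "E06000014",                     # City of York
}

def normalise(name):
    """Reduce a council name to a canonical form before matching."""
    name = name.lower().replace("&", " and ")
    name = re.sub(r"[^a-z ]", "", name)  # drop apostrophes and punctuation
    # Strip the boilerplate words that vary between sources.
    stop = {"and", "bc", "borough", "city", "council", "district", "of", "the"}
    # Sorting makes word order irrelevant.
    return " ".join(sorted(w for w in name.split() if w not in stop))

def gss_code(raw_name):
    """Exact match on the normalised name, falling back to a fuzzy match."""
    key = normalise(raw_name)
    if key in GSS_LOOKUP:
        return GSS_LOOKUP[key]
    close = difflib.get_close_matches(key, GSS_LOOKUP, n=1, cutoff=0.8)
    return GSS_LOOKUP[close[0]] if close else None

print(gss_code("Borough Council of King’s Lynn & West Norfolk"))  # E07000146
print(gss_code("York City Council"))                              # E06000014
```

The fuzzy fallback (`difflib.get_close_matches`) mops up spellings the stopword list doesn't anticipate, such as "King's Lynn & W. Norfolk"; anything below the cutoff returns None and gets added to the list by hand, just as with the spreadsheet.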
Until she moved on to better things, we used to beg favours from a friendly LG-Group GIS person to make maps for us (Zoe – you are missed). In theory, we still have ArcView licences somewhere, but after at least two office moves and numerous changes in staff I don’t have the appetite to wrestle with the corporate IT people to get it re-installed and connected.
I decided to use open source software (QGIS), together with the newly opened Ordnance Survey data, to quickly hack together a few maps that use my local data as an attribute. The documentation I could find was either a bit worthy and slow, or expert and quick. If you half-know what you’re doing, and have a project in mind, these notes may help you make a map inside a day. Continue reading →
A little while ago, I was presenting some of my geeky stats to a room full of people. We were reviewing how variable the workload coming into a planning department is – much more than you’d imagine. My argument is that simple performance measures (we will validate 90% of applications on the same day) are pointless and act to demotivate people. Anyway. As it happened, a couple of authorities in the room were planning to combine their back offices. They asked a really basic question – “Would combining our workloads smooth out the peaks and troughs, or make them worse?”. It’s an important question. At the time, we eyeballed it and reckoned that the answer was “no, it would probably make things worse”.
What with the formation of “super” authorities in the news recently, when another person asked me a similar question I thought I’d try to answer it properly. Like many simple-sounding questions it is actually quite tricky to pin down. This is my attempt, and given that I’m not a professional statto your comments and criticisms are welcome. Let’s find out whether there is an economy of scale to planning applications. Continue reading →
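For what it's worth, the eyeballed answer can be checked with a toy simulation. Everything below is invented for illustration (made-up weekly volumes, a shared seasonal cycle): the point is that merging only smooths the peaks if the two workloads are independent. When both authorities ride the same seasonal cycle – which is the usual reality for planning applications – the relative variability of the combined stream barely moves.

```python
import math
import random
import statistics

def cv(series):
    """Coefficient of variation: spread relative to the average workload."""
    return statistics.stdev(series) / statistics.mean(series)

random.seed(1)
weeks = range(104)  # two made-up years of weekly application counts

# Case 1: both authorities share the same seasonal cycle (the usual reality).
season = [50 + 20 * math.sin(2 * math.pi * w / 52) for w in weeks]
a = [s + random.gauss(0, 5) for s in season]
b = [s + random.gauss(0, 5) for s in season]
merged = [x + y for x, y in zip(a, b)]

# Case 2: purely independent workloads (the optimistic assumption).
ia = [50 + random.gauss(0, 15) for _ in weeks]
ib = [50 + random.gauss(0, 15) for _ in weeks]
imerged = [x + y for x, y in zip(ia, ib)]

print(f"correlated:  A alone {cv(a):.2f}, merged {cv(merged):.2f}")
print(f"independent: A alone {cv(ia):.2f}, merged {cv(imerged):.2f}")
```

In the correlated case the peaks arrive together, so they add rather than cancel; in the independent case the combined coefficient of variation falls by roughly a factor of √2, which is the economy of scale the optimists are hoping for.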
After a fairly heavy set-up, this is a more relaxed look at some of the visualisations possible when you have benchmarked your data. This, if you like, is the pay-off for the hassle of aligning your data with the standards. Our first one is nicknamed “the factory gate”, and treats the application system as if it were a manufacturing plant. Continue reading →
To ease the burden of supporting our benchmarking project “managing excellent planning services” (aka MEPS) we have been experimenting with some stats toys. While this started out as a “howto” in Excel, the idea of being welded to a mouse for the next few weeks did not appeal. This is the technical set-up, where I describe in some detail the tools and code used to make the graphs that benchmark performance. Following on from this I go through the five sets of visualisations that support MEPS. If you want to know how we made it, or if you are living in a local authority trying to find ways of tracking performance, this might just be useful. Continue reading →
As I mentioned a short while ago, we’re about to launch a new series. Originally our attempt to take the NPIP work forward, it has grown into a longer-term piece of support to help local authorities adapt not only to a (cross fingers) brief downturn in applications but what may be long-term pressure on operating budgets. Continue reading →
It’s possible to get a bit of a feel for the planning zeitgeist from some of the email requests we receive. Increasingly over the last few months we’ve had requests for help from people wanting to understand their costs, their performance and whether they stack up in relation to their peers. We are currently working on a fairly chunky piece to do just this – and this post will help explain a little of what we’re up to and, perhaps more importantly, what we are *not* doing. I’ve covered some of this ground before, but my thinking seems so out of step with the world I’ll repeat it until someone explains what I’m missing. Continue reading →