MEPS is 10 months old now and we have 7 benchmarking groups made up of 30 authorities. As expected, this project gets more and more interesting as we add more sets of data into it. It's early days – I am not quite ready to share individual authorities' improvement stories, but I can start sharing what we are learning about the planning beast more generally. Some of it confirms things we know, some makes us question some of the 'received wisdom' about what improves planning services, and other bits are, frankly, leaving us scratching our heads…
What is clear is that our decision from the outset, not to lump authorities into league tables and endorse the quickest and cheapest, was a good one. We have got underneath things. Your service costs, performance, productivity and what's important to you and your customers vary widely, even between authorities of a similar type. This demonstrates how dangerous it would have been for us to use this project to suggest to authority x that it could save money simply by becoming as cheap as authority y.
This variation, and the level of detail we now have, has shown us things that the national indicators tend to hide, made us question some of the national 'improvement' initiatives of recent years, and highlighted a few things that can't be defended with a finger in the air or gut instinct. This stuff is relevant to all authorities. It should also be of interest to the Government as it prepares the next iteration of the planning system. Remember, this is all based on real data, albeit from a limited selection of authorities.
Lesson 1 – When I am busy, you’re busy
Weekly work levels across our first three groups of authorities peak and trough in almost identical patterns. If you think that sharing staff or back offices will improve productivity at busy times, you may have to reconsider. If what we are seeing continues to hold, I might even be bold enough to say that sharing back offices is likely to make your problem worse. This is a head-scratcher at the moment – we aren't sure why workloads are so unpredictable – ideas welcome.
Lesson 2 – Economies of scale only apply to plan making
Some authorities are able to process applications for less than the fee income. However, this only holds before all the other costs of providing a service are added. The data also suggests that processing gets more expensive as volume increases – a kind of reverse economy of scale.
Lesson 3 – A good NI157 score at best doesn't tell you much, at worst hides the reality of your customers' experience
If I said that only 1 in 12 authorities can demonstrate performance we would describe as 'good', would you believe me? NI157 also gives authorities no incentive to improve their validation processes. When you add the validation and determination times together, most authorities are taking well over 8 weeks to determine applications.
Lesson 4 – Application quality
Nothing concrete on this yet, but we are noticing a few trends that are making us look more closely at the different ways authorities receive applications and the effect that can have on validity, withdrawal/refusal rates, and the overall length of time taken to determine them. Watch this space.
Lesson 5 – Service costs
We have detailed data across 30 different activities that make up the planning service. The cost of these component parts varies widely, and is probably most marked in servicing the planning committee. The same variety is seen in the 'manufacturing' cost of producing core strategies, in particular the commissioning of the evidence base. I noticed the other day that Scotland is considering letting planning authorities set their own fees, and couldn't help thinking about how an authority would work out what to charge. I am now wondering if we may have accidentally created the basis of a pricing system, should a similar change happen here…? One to think about.
Lesson 6 – Your staff are OK with time recording, and you need to sort your systems data out
The thing we thought would be difficult to make fly – staff recording their time each day for a month – worked relatively easily. The bit we thought would be easy – getting good quality data from IT systems – wasn't. With time recording, an inclusive, open and honest approach has seen staff respond positively. With the systems data, we underestimated how difficult it is for authorities to access raw data, and how many holes it contains.
Joint venture with CIPFA
We're starting a planning benchmarking club with CIPFA and are encouraging people to join. As we start sharing more and more of the detail and the data behind our findings, we'll be doing so in a structured way through the benchmarking club. We want every council in the country involved in this. Some of what you'll learn will be tough to take, but this is the kind of medicine that's needed if quality planning services are going to survive.
To end this post, I want to return to some of my opening points. The best thing about working on this project (and what authorities seem to like about it) is that it has no 'agenda'. It is not trying to 'prove' anything or endorse one way of running a service over another. It is because of this approach that we are continually being surprised by the directions it takes us in and what it shows us. I think the same will be true for authorities that take part – find out what you cost, what customers are really getting for their money, then, with the help of some peer authorities, decide what you want to change. Oh, and be prepared to be surprised (and possibly frightened).