How to define a good planning service

We’ve been chipping away at what good looks like for years at PAS. Planning is uniquely difficult – not only do we share a problem with GPs (sometimes ‘success’ is something not happening), but when something does happen we’re often mediating between people who want opposing things.

We keep on at it because it’s important, and a definition of “good planning” doesn’t have to be perfect to be useful. It’s clear that there is no single measure, so in advance of Martin’s work with a bunch of councils on this very topic, here are my thoughts. For no other reason than someone asked me this morning.

I’m going to resist the ‘dashboard’ metaphor, because I sense that a dashboard has warning lights on it, and in the absence of warnings we can just tick along. Instead I’m going with the old favourite “continuous improvement”, and I’m going to measure two kinds of thing: I’m going to gather some opinions, and I’m going to measure some facts. In this post I’m going to concentrate on the applications process only.

Perspectives on performance

Three perspectives are primary. I’d restrict myself to these three and spend time and energy getting the quality of the data right. No ranking is suggested by the order I list them in – they are all valid and one should not be used to trump another.

Perspective 1: The applicant. Don’t cavil – this is your customer. We’ve learnt a lot through our postcard surveys of applicants in the past. From my experience, we need to improve in two areas. First, get participation levels up: make it easy (a scannable QR code on the decision notice?) and make it worthwhile (some kind of incentive – free building control?). Second, link the feedback you get to the sort of application it was, so you understand which bits of the service this particular applicant experienced.

Perspective 2: The ward councillors. All of them. They should see all sides of the development process, and are uniquely placed to give a perspective on the political nature of planning. They are as close as you’ll find to a local service user champion. To prevent it becoming a plebiscite on current applications, I’d start by showing each ward member a list of the significant applications in their area over the previous six months.

Perspective 3: Your staff. They understand the job and want to do it well. Does your organisation encourage and allow them to do so? Probably more useful than turnover or sickness figures is asking them whether they feel organised to deliver a good service.

Indices of performance

There are about a squillion things that could be measured. Trying to reduce them down to an absolute minimum, I’d start with just three areas: how well we hit targets, how much waste we tolerate and how much it all costs. And I suppose it’s just because waste is so annoyingly preventable that I’d have several ‘waste’ indicators.

Most of these are pretty straightforward and don’t need much explanation:

Repeat applications: often (but not always) following a withdrawn application. No fee. Repeat consultation. Annoying for everyone.

Reworked applications: usually at the initial stages of validation. Every time an application is picked up, assessed as wanting and flipped back to the agent, it costs time and money.

Appeals: appeals represent a failure (sometimes, it’s true, because of irrational behaviour on the part of an applicant). They are a bad thing even when you “win”, and a sound indicator of poor-quality service when you lose.

Success: we can’t have only failure indicators. How often do you do what you say you’ll do – which is issue a decision on time? Just because I whinge about the national indicators almost continually does not mean that time is unimportant.

Cost: part of doing things well is doing things slickly, and that means applying resources where they’ll make the most difference. No service can afford to be cost-blind, so I’d want to know how much of the council’s own money goes into delivering the planning service. To stop the figure bumping around too much I’d ignore majors – this is about the routine 90%.

Conclusion

So, if I were setting out to improve my DC department, I’d use 3 opinions and 5 facts:

  • Applicant happiness (did we do a good job for you?)
  • Ward councillor happiness (did we do a good job for the place?)
  • Staff happiness (did we help you do a good job?)
  • How much waste on repeat applications?
  • How much waste on reworked applications?
  • How much waste on appeals? How many appeals lost (a marker of poor quality)?
  • How many applications hit their target? How many have missed? How many are going to miss?
  • How much extra did the council have to pay per application to make this happen?

And then I’d get a group together to think about how to improve each of these eight indicators. I’d encourage them to think about how changes in one area might have a knock-on effect in another, but I’d trust them to get on and make whatever changes were necessary. In six months’ time I’d measure again and see whether my service was improving or getting worse. That, my friends, is the way continuous improvement rolls.
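For the five ‘facts’, most of the raw material already sits in the back-office system. Purely as a minimal sketch – assuming a made-up Application record with hypothetical fields, not any real planning software – the fact indicators for the routine caseload might be pulled together something like this:

```python
# Illustrative sketch only: the fields (is_repeat, validation_rework_count,
# appealed, appeal_lost, decided_on_time, council_cost) are hypothetical -
# substitute whatever your own system actually records. Majors are excluded,
# as the post suggests, to stop the figures bumping around.

from dataclasses import dataclass

@dataclass
class Application:
    is_major: bool
    is_repeat: bool               # re-submission, often after a withdrawal (no fee)
    validation_rework_count: int  # times bounced back to the agent at validation
    appealed: bool
    appeal_lost: bool
    decided_on_time: bool
    council_cost: float           # council's own money spent, net of the fee

def fact_indicators(apps):
    routine = [a for a in apps if not a.is_major]   # the routine 90%
    n = len(routine) or 1                           # guard against an empty period
    return {
        "repeat_rate": sum(a.is_repeat for a in routine) / n,
        "rework_per_application": sum(a.validation_rework_count for a in routine) / n,
        "appeal_rate": sum(a.appealed for a in routine) / n,
        "appeals_lost_rate": sum(a.appeal_lost for a in routine) / n,
        "on_time_rate": sum(a.decided_on_time for a in routine) / n,
        "council_cost_per_application": sum(a.council_cost for a in routine) / n,
    }
```

Run the same calculation now and again in six months: it’s the comparison, not the absolute numbers, that tells you whether the service is improving.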
