Benchmarking – improving planning services

(This article is reproduced with the permission of the journal of the TCPA.)

The Planning Advisory Service (PAS) is a government-funded programme of improvement for local authorities. Eighteen months ago we began a project for managers of planning services. This is the story of one authority, Hastings Borough Council, who were in our very first benchmarking group of six coastal authorities in the South East.

The most common performance measure published by planning authorities is National Indicator 157 (NI157), on speed of determining planning applications. Hastings’ performance against NI157 (centre top in the chart above) showed a slight decline over the last four years compared with that of its peers. Our work with this group was designed to get behind this sort of statistic, to unpick what was really going on and so help the authorities understand the implications for costs, time and performance.

A back-office view
Our project began by taking a copy of each planning department’s applications database and translating it into a common format. This allowed detailed comparisons between the authorities in the group on various parts of the process. It gave us evidence – realistic models for cost and time.
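The translation step can be sketched as a simple field-mapping exercise. This is an illustrative Python sketch only (the project itself used R); all field names and the mapping table are invented, since the real schemas are not described here:

```python
from datetime import date

# Hypothetical shared schema: every authority's export is mapped onto
# these fields so records can be compared directly across the group.
COMMON_FIELDS = ("authority", "app_ref", "received", "decided", "decision")

# Invented per-authority field maps: source column -> common column.
FIELD_MAPS = {
    "authority_a": {"ref": "app_ref", "date_in": "received",
                    "date_out": "decided", "outcome": "decision"},
    "authority_b": {"case_no": "app_ref", "valid_date": "received",
                    "issue_date": "decided", "result": "decision"},
}

def to_common(authority, record):
    """Translate one raw record into the shared schema."""
    mapping = FIELD_MAPS[authority]
    row = {"authority": authority}
    for src, dst in mapping.items():
        row[dst] = record[src]
    return row

# Example raw record from "authority_b" (all values made up).
raw = {"case_no": "HS/2010/0042", "valid_date": date(2010, 3, 1),
       "issue_date": date(2010, 4, 26), "result": "granted"}
row = to_common("authority_b", raw)
```

Once every authority's data sits in the same shape, elapsed times and costs can be computed identically for each, which is what makes the comparisons meaningful.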
For example, if an authority wanted to know exactly when decisions were being made in the process and how much a particular activity cost, and then make comparisons with other authorities within the group – it could. This way, they could either ask themselves ‘Why are we so out of kilter with the others?’, or say ‘Actually, this looks fine’. One of the earliest observations was about decision-making. All the pilots showed what became known as ‘last-minute-itis’.

This plot shows decisions often being issued on day 56

This was eye-opening because decisions for all the authorities in the group – including those with improving NI157 performance – were becoming increasingly clustered around day 56. This has two consequences. The first is that efficiency is necessarily compromised – work is processed in the order in which it will go out of time, rather than being clustered to reduce site visit time. It also has important knock-on implications for applicants, as discussed below.
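The clustering itself is easy to quantify once the data is in a common format. A minimal illustration in Python (the project's actual plots were built in R and ggplot), using made-up decision times for a single authority:

```python
# Synthetic decision times (days from receipt to decision) for one
# authority -- invented numbers, for illustration only.
decision_days = [30, 42, 54, 55, 55, 56, 56, 56, 56, 56, 56, 57]

# Share of decisions issued in the final week before the 8-week
# (56-day) statutory target: the 'last-minute-itis' signal that the
# jitterplots made visible.
last_week = [d for d in decision_days if 50 <= d <= 56]
share = len(last_week) / len(decision_days)
```

In this toy sample three-quarters of decisions land in the last permitted week, the pattern the group saw across all the pilots.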
While jitterplots revealed the issue, it took another revision to the plots to turn them into more useful information. A box plot was required to show more clearly the trend over time:
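What a box plot adds over a jitterplot is a five-number summary per period, so the movement of the median and quartiles over time is visible at a glance. A hedged Python sketch of the underlying summary (quarter labels and numbers are invented; the project used R and ggplot for the plots themselves):

```python
import statistics

# Synthetic quarterly samples of decision times in days -- invented
# numbers chosen to show a distribution shifting away from day 56.
quarters = {
    "2009 Q4": [35, 40, 48, 52, 55, 56, 56, 56],
    "2010 Q3": [30, 34, 38, 42, 45, 48, 50, 56],
}

def five_number(days):
    """The five-number summary a box plot is drawn from."""
    q1, med, q3 = statistics.quantiles(days, n=4)
    return {"min": min(days), "q1": q1, "median": med,
            "q3": q3, "max": max(days)}

summary = {q: five_number(d) for q, d in quarters.items()}
```

Plotting one box per quarter from summaries like these is what turned the scatter of individual decisions into a readable trend.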

The participants found this one of the most useful discoveries of the benchmarking process, and it was clear how poorly served they were by their existing performance monitoring systems. This also provided confirmation that some of the previous ‘improvements’ to the system were not actually making things better:

‘Our behaviour throughout every stage of the process was driven by the need to meet the targets. In my view the whole industry of pre-application discussions largely came about to exclude part of the planning process from the performance indicators.
Applicants don’t care whether we call it pre-app or the formal application stage: what they care about is the time it takes from first approaching us to them receiving their decision. Moreover, ‘pre-app’ worries local residents, who sense deals being done behind closed doors. If we step away from target-chasing we could negotiate as part of the formal planning process – it would be more transparent and from an applicant’s perspective it wouldn’t take any longer.’
Raymond Crawford, Head of Development Control, Hastings Borough Council

So what?
Our work with the group used data and understanding of performance as a way to direct improvement. Once there was confidence in these benchmarks, the focus then shifted onto doing something to improve them:

‘Not many will have had data presented to them in this way, but if all we did was provide new insights into performance then we will have failed. The real value comes from getting councils together, helping them understand what the data is telling them, and then helping them get started on joint or individual improvement projects.’
Martin Hutchings, Project Manager, PAS

On recommendation, we worked with Barrie McKinnon from Salford City Council. Using a process improvement methodology called SPRINT, he ran a training course over two days for the group, followed by a one-day Design Studio some weeks later:

‘The purpose of the Design Studio is to explore the way people currently work and to identify where there are weaknesses or gaps in the provision – or simply where work can be done in a completely new, radically different way, typically making best use of technology. The Design Studio is usually employed by staff from a single organisation – so the group effort as seen in this instance is quite a new approach – allowing the radical innovative ideas of people from one place to be shared and shaped by people from another.’
Barrie McKinnon, Senior Business Analyst, Salford City Council

The Design Studio offered the opportunity for people to think, help each other and discuss ideas in a supportive environment. Many of the changes suggested were not wildly innovative or game-changing, but the group agreed a number of improvements that they would test out in practice. Hastings agreed to test a triage system, with other members of the group each taking away other ideas:

‘I have always struggled to explain to a householder why it will take eight weeks for us to decide whether they can have their conservatory. It takes the same number of staff hours to deal with a simple application in eight weeks as it does in six weeks; most of the time the application is sitting in a queue waiting to get to the top of the pile. Caseworkers tend to deal with applications consecutively, and a simple application can easily get stuck behind a more complicated one. This led to our trial of dealing with simple applications through an ‘applications hotdesk’ and keeping them separate from other streams of work.
‘But there have been longer-term benefits: for me the experience has led me to question every part of our process. This is only the first in a series of changes we’re going to make. And following our success, SPRINT training has been delivered to other managers in Hastings outside the Planning Department – it has implications for many service areas now.
‘Lastly, successfully improving performance at such a challenging time – without additional resources – is something that we are very proud of, and it has been really good for team morale.’
Raymond Crawford, Head of Development Control, Hastings Borough Council
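The queueing logic behind the triage trial can be shown with a toy model. This is purely illustrative – the service times and job mix below are invented – but it captures why a simple application stuck behind a complicated one takes so long, and what splitting the streams changes:

```python
# Toy model: simple applications take 1 day of casework, complex ones
# 10 days (made-up figures). Caseworkers work consecutively.
jobs = [("complex", 10), ("simple", 1), ("complex", 10), ("simple", 1)]

def completion_times(queue):
    """Day on which each job in the queue is decided, worked in order."""
    t, out = 0, []
    for kind, effort in queue:
        t += effort
        out.append((kind, t))
    return out

# Single first-come-first-served queue: simple cases wait behind
# whatever arrived first, finishing on days 11 and 22.
fifo = completion_times(jobs)

# Triage: simple and complex cases get separate streams (as in the
# 'applications hotdesk' trial), each worked consecutively.
simple_stream = completion_times([j for j in jobs if j[0] == "simple"])
complex_stream = completion_times([j for j in jobs if j[0] == "complex"])
```

In the toy numbers the simple cases go from being decided on days 11 and 22 to days 1 and 2, while the complex cases are decided no later than before – the same total staff effort, rearranged.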

The above shows how the time taken to issue decisions is reducing – but there is something more powerful to come out of this. It is the way you can manage and use the expectations of applicants.
Not only have the Hastings team done a fantastic job of streamlining the process for straightforward applications, they’ve given themselves a great platform from which to improve further by nudging the users of their service in the right direction.
At the beginning of the year, each applicant could expect an answer on day 56 – whether they submitted nonsense or a beautifully complete case with every aspect nailed down. Hastings can now demonstrate that it deals with most applications between day 34 and day 50 – depending on how much work is required. Suddenly, there is some incentive for applicants to submit quality applications – they get a decision more quickly. This transparency about performance actually promotes a sensible conversation where applicants can see that they can influence the service they receive.

It has been an unexpected bonus that some tools we built to help managers understand their services might have broader use in sharing responsibility for performance between Hastings and their planning agents. I have also really enjoyed working with SPRINT, and have been heartened by how well the sector can help itself.
Of course, the project has also been very useful for our improvement work at PAS, because we can then build on the Hastings experience to further streamline planning. It is clear that we can make even sharper distinctions between the straightforward applications and significant, risky ones.
In my view, the planning sector has been sensitive about acknowledging that much of our work amounts to quick, cheap and straightforward certifications. There is a perception that this risks devaluing planning, and presents it as a regulatory ‘box ticking’ service. But actually it makes sense to acknowledge the range of work that goes into a planning function. It seems inevitable that we are going to see planners on-site, agreeing permissions or variations immediately simply by using their professional judgement. This alarms some people, but the truth is that many of our processes add cost without adding great value or democratic accountability.
Back to Hastings. Any meaningful improvement is a people story, and Raymond Crawford and all involved deserve great credit:

‘For me, ‘the Hastings experience’ is about giving people the space and time to make decisions. This comes not only from having good insight from your data, but also from having your peers challenge, endorse and (hopefully) copy your good ideas. “More for less” is not going away, and Hastings have a great platform to streamline planning still further.’
Martin Hutchings, Project Manager, PAS

[Note: R and ggplot have been used extensively throughout this project for their ability to quickly generate comparative plots of benchmark data.]

