Quality/Designation Measure (2) – South Somerset ‘get’ it

Less than a fortnight ago I began working with Marc Dorfman, Special Planning Advisor at South Somerset District Council (SSDC), helping him get underneath the government’s quality designation measure. SSDC have reacted at lightning pace to improve performance; they have quickly tweaked procedures and set up a measuring and monitoring system. It’s worth sharing as a good example of how to get members and officers pulling together on what is a difficult subject. Marc said:

“The MHCLG Indicator on “Major Appeals Quality” is having an impact. The 10% “poor performer designation threshold” is very low. With many authorities receiving and deciding some 50-150 majors over two years, only 15 lost major appeals over two years would put the LPA into the designation zone.”

Understanding the measure is not straightforward

Over the first designation period (2015-17), South Somerset came out at 9.3%, just below the designation level, when the national average was 2.5%. SSDC recognised they needed to do something. Supported by PAS with “short cut calculations” and detailed advice on how to calculate the 2016-18 and 2017-19 designation periods (there’s a two-year assessment period and a nine-month time lag (see my last blog) to get your heads around), SSDC decided to put in place an action plan:

SSDC Action Plan

1. Monthly monitoring, using the PAS Designation Quality ‘Crystal Ball’ toolkit

2. Workshops with all councillors and planning committee members to look at the detailed statistics and at key decisions, plus a review and reminder session for councillors on their own code of practice for planning

3. A new procedural step: Area Planning Committees cannot refuse major applications; they can only make a recommendation to a Strategic Committee, allowing a “time out” for further negotiation and consideration.

SSDC are getting officers and members to work together to address the “slight increase” in major appeals allowed. “Slight”, because it only takes a few extra losses over 2 years for designation to be threatened.

Going forward, SSDC say they would like the threshold increased. The LPA is working hard to deliver its annual five-year land supply target (currently delivering 600 against a target of 725), but now it has to deal with the 20% buffer and the major appeals indicator. SSDC wonder why government is happy to punish a rural LPA to such a degree when it is delivering over 80% of its annual target.


Quality/Designation Measure (1) – do you ‘get’ it?

Last year the government got serious about the ‘Quality’ of decision making as part of the Designation regime. They are still mulling over whether to designate any councils, and, while the jury is out for last year, the regime continues and we’ve already reached the end of the two-year assessment period (April 2016 – March 2018) for this year’s round.

I’m regularly asked how the performance calculation works – like most things to do with designation, no one really bothers to understand it until they get into trouble, and by then it’s usually too late. So, please read this previous blog on staying ahead of designation, and read on for a better understanding of the quality performance measure.

Quality of decision making – how is my council’s performance measured?

The ‘Quality’ performance criterion is the total number of appeals lost divided by the total number of applications decided (so it’s a percentage of overall decisions, NB not the number of appeals lost divided by the number of appeals made).

This is how it operates:

  1. The current period under assessment is 1st April 2016 to 31st March 2018.
  2. Within this period, MHCLG count up all of the applications decided during the assessment period, plus any that went to appeal for non-determination. This is one part of the sum: the ‘total number of applications’. MHCLG will wait until December 2018 before getting out their calculators – this allows a nine-month ‘lag’ for appeals to work through the system.
  3. The second part of the sum is the number of appeals made on those applications that were decided during the assessment period. The number of overturns relating to these appeals is then viewed as a percentage of all decisions made in the period. A mocked-up example:
  • Total number of decisions made in the designation period: 100 (all decided applications, plus non-determined applications that were appealed)
  • Appeals made on these decisions: 20 (appeals made and decided by the time MHCLG do their sums, nine months after 31st March 2018 – the ‘lag’ period)
  • Appeals overturned: 11 (out of the 20)
  • 11/100 expressed as a percentage gives a designation performance of 11%. This is above the 10% performance threshold, so you’d be in trouble.

The number of appeals made, or your success rate at appeal, doesn’t really come into the equation – what counts is the number overturned as a percentage of the overall number of decisions.
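
If you want to sanity-check your own numbers, here is a minimal sketch in Python of the sum described above. The function name and figures are mine, purely for illustration; MHCLG’s published methodology is the definitive version.

```python
def quality_measure(total_decisions, appeals_overturned):
    """Quality measure for designation: appeals overturned as a percentage
    of all decisions made in the two-year assessment period (including
    non-determined applications that were appealed), NOT as a percentage
    of appeals made."""
    return 100.0 * appeals_overturned / total_decisions

# Mocked-up example from the post: 100 decisions, 20 appeals, 11 overturned.
performance = quality_measure(total_decisions=100, appeals_overturned=11)
print(f"Designation performance: {performance:.1f}%")  # 11.0%
print("Above the 10% threshold:", performance > 10.0)  # True - you'd be in trouble
```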

How to manage quality performance

Because of this nine-month time lag, you need to start looking NOW at what might happen come December 2018 (when MHCLG do their sums). Here’s what to do:

  • You know how many majors* you’ve decided in the designation assessment period (ref. your PS1/2 returns)
  • You know how many cases you’ve refused, so you should have an idea which of these refusals are likely to be appealed (and, if so, which ones you are confident of winning)
  • You also know which appeals are currently going through the system and should have a feel for whether they are likely to be overturned or not.
  • By putting all of these together you should have a feel for what your risk is come December 2018 (a rough sketch of this sum follows below).
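
To show how those pieces fit together, here is a minimal sketch in Python of a rough forward projection. The inputs and the way further losses are estimated are my own assumptions, not a PAS or MHCLG method; the crystal ball toolkit is the place to do this properly.

```python
def projected_performance(majors_decided, overturns_so_far,
                          live_appeals_likely_lost,
                          refusals_likely_appealed_and_lost):
    """Rough forward projection of the quality measure: overturns already
    recorded plus your estimate of further losses, as a percentage of the
    majors decided in the assessment period."""
    projected_losses = (overturns_so_far
                        + live_appeals_likely_lost
                        + refusals_likely_appealed_and_lost)
    return 100.0 * projected_losses / majors_decided

# Illustrative figures only - substitute your own PS1/2 and appeals data.
risk = projected_performance(majors_decided=120,
                             overturns_so_far=7,
                             live_appeals_likely_lost=3,
                             refusals_likely_appealed_and_lost=2)
print(f"Projected designation performance: {risk:.1f}%")  # 10.0%
print("At or above the 10% threshold:", risk >= 10.0)     # True - time to prepare
```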

In many ways it’s already too late to affect performance this year, BUT IT’S NOT TOO LATE to understand where you’ll end up. If you are in trouble, government will expect you to have a good response prepared about why performance looks the way it does and what measures you’ve put in place to address things – the sooner the better. MHCLG fund PAS to support councils in trouble, so get in touch with me (martin.hutchings@local.gov.uk) if you have concerns and we can assess the risk and begin helping you put a strong case together should the worst happen. Even if you are OK this time, it’s not too late to start preparing for the 2017-19 assessment period using the approach outlined above.

*The quality indicator works in the same way for majors and non-majors – but no one gets close to the threshold on non-majors because of the sheer number of decisions.

Reflections on the 2018 NPPF

As has become standard, PAS supported a team of officials from MHCLG presenting a series of roadshows around the country throughout April and May 2018. We did 10 events and spoke with about 450 people from 204 councils. This sort of gig provides plenty of sparky people with interesting points of view, as well as protracted periods sitting on trains reflecting on how it all stacks up for local government.


So, what follows are my own personal thoughts – and remember that all of this is a consultation at the moment, so it very much represents this moment in time. And, of course, the issues that tend to come out in debate and discussion are the difficult, awkward ones, which will make this list sound a bit negative. Those that know me will tell you I am a little ray of sunshine and I find negativity very difficult. Nonetheless, alongside the standard themes of “who is going to pay for all this?” and “everyone is too busy to think about and deliver this change in a sensible way” are five big thoughts: Continue reading

Planning and the GDPR

I’ve been on a little voyage with the GDPR. Originally I argued that we needed to do a quick “heads up” on the key points for planners. There was (to be honest) a little bit of umming and aahing about whether planning was “special” enough to deserve something sector-specific, but in the end it was agreed that it was. Just something “quick and dirty”, so off I went.

Thanks to lots of planners who asked me questions, thanks to the ICO and MHCLG, thanks to Umbreen and the ALBPO TS group, and thanks to Cheshire West & Chester, we will be making something public in the next week or so that I hope will be a first step towards a practitioners’ guide. We can then evolve it as questions get addressed and we make joint decisions about how to behave in the grey areas.

For now, though, and in advance of the official, signed-off version, I thought I’d give you my own thoughts on all this. Continue reading

Designation – get ahead and stay ahead

Don’t get caught out by designation – make the ‘PAS designation speed and quality crystal balls’ a permanent part of your performance management system. Get ahead by understanding, on a quarter-by-quarter basis, how your performance (e.g. speed of issuing decisions, quality of decision making) stacks up against the government’s performance measures. In December, for the first time, councils will be put on notice of designation for poor-quality decision making – the two-year period being assessed for this has already passed (April 2015 – March 2017), so don’t wait until December; use our crystal ball to assess if you are at risk NOW.

PAS designation speed and quality crystal balls can be found on the knowledgehub here.

Published statistics are too old to be useful
The government’s published statistics only help you look backwards: they are at least three months old when published and, because they look back over a rolling two-year period, they don’t really help you understand how well you are performing within the period you will finally be judged on. This means that for many councils not paying attention, it is already too late by the time they find out they are at or close to the government’s designation thresholds.

Manage performance in ‘real time’
The PAS designation crystal ball allows you to measure your performance in as close to real time as you care to feed it your most up-to-date performance data. At any given time it shows how much cushion you have, or how big a gap you need to make up, between your performance and the designation thresholds.

We are encouraging ALL councils to use the crystal balls as part of their performance management framework. Many councils get caught in the designation process because poor performance ‘creeps up’ and, because the reporting period covers two years, just a couple of poor quarters can really drag overall performance down, leaving some councils little time to recover.

PAS focus
PAS uses the crystal balls on a national scale to keep an eye on how councils are performing and to offer improvement support. Our resources for doing this are finite, and we can only ever get to those councils whose performance is dropping sharply and noticeably.

So, get ahead, use the crystal balls and tackle poor performance before it takes hold. Visit the khub, download the toolkit and let us know how useful you find them.


IR35 and agency Planners

[13th Sept – this post was edited to make it clear that the IR35 changes are in the past – it is the impact of the changes that are still being felt]

There are changes afoot. IR35 (also known as off-payroll) rules changed this year, and this affects people who have formed their own one-man bands to sell themselves to planning departments. The short version is that the employer now needs to decide whether a worker is “in” or “out”, and there are tax implications (and potential back-tax implications) that flow from this.

This is only the latest in a series of changes of this kind, and there is a broader debate about fairness that I’m not interested in right now. There is also the issue of overall Planning capacity that I’m not going to go into either. My point is – what will the impact of this change be on rates?

Continue reading

Reflections on the Housing White Paper

We finished our event series on the HWP yesterday, and before the memory starts to fade I thought I’d try to set out my thoughts. This is a personal reflection rather than the FAQ (and member briefing) on the HWP that we have separately promised to make.

What the HWP means for Local Plans

Most obviously, the HWP disrupts the local plan system in three ways… Continue reading