I was asked to speak to one of our regional groups (HIOWLGA – don’t ask me to pronounce it) last week. It’s a group of leaders and CExs and they wanted me to take them through the implications of the new NPPF / NPPG.
There is nothing like the polite attention of a room full of clever people to force a bit of introspection and reflection. What do I really think they should know about the NPPF? And how important is it all really when compared to the intense gloom and doom over finances?
I don’t think it’s good enough for the proper website, but I’ll pop it here in case anyone else finds something in it to reuse. One of the CExs told me afterwards that he had already been briefed on the NPPF, but that compared to how I’d approached it there had been a veil of gauze over the issues. I think it was a compliment – at least that is how I took it.
Sorry there aren’t any speakers notes – the slides are fairly self-explanatory and I just took people through it in my customary manner. Lots of questions but I don’t think my answers were good enough to be worth setting out here.
Latest performance statistics are out (20/09/2018)
The latest stats were published yesterday by MHCLG. Only one more quarter to go, and that is almost over (to end of September).
PAS Support (1) – get ahead of the game – avoiding designation
PAS’ job on Designation is to help councils recognise, measure and, most importantly, avoid it (I wrote a blog last year on that very matter – it’s still relevant; just move the dates referred to on one year).
I’ve updated the PAS ‘Crystal Balls’: one for speed, one for quality. They let you measure performance in as close to real time as you care to feed them your latest data. I’d encourage ALL councils to use the crystal balls as part of their performance management framework. Don’t get caught out: poor performance often ‘creeps up’, and over a 2 year reporting period just a couple of poor quarters can really drag overall performance down, leaving little time to recover.
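The ‘creeping up’ effect is easy to see with a toy calculation. This is an illustrative sketch only: the quarterly numbers are made up, and it assumes the speed measure is simply in-time decisions as a percentage of all decisions over the whole two-year period (60% is my understanding of the majors speed threshold).

```python
# Illustrative only: hypothetical quarterly figures showing how just two
# poor quarters can drag a rolling two-year average below a threshold.

def period_performance(decided_in_time, decided_total):
    """In-time decisions as a percentage of all decisions over the period."""
    return 100 * sum(decided_in_time) / sum(decided_total)

# Eight quarters of major decisions (made-up numbers).
decided_total   = [20, 20, 20, 20, 20, 20, 20, 20]
decided_in_time = [14, 14, 14, 14, 14, 14, 5, 5]  # six steady quarters at 70%,
                                                  # then two poor ones

print(f"{period_performance(decided_in_time, decided_total):.2f}%")  # 58.75%
```

Six quarters comfortably above 60% still end up below the line overall once the two bad quarters land, and by then only a couple of quarters remain in which to recover.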
PAS Support (2) – get ahead of the game – how best to respond if you’re in danger
If you are a head of planning and I had concerns about your performance, I will most likely have written to you earlier this year offering a menu of support, including FREE help to:
• re-check your PS1/2 data submissions (you’ll not be surprised to learn that most I check have errors)
• review DM process efficiency if you are failing on the Speed measure
• review committee operation and decision making if you are failing on the Quality measure.
The 2 year reporting period means new councils can appear on my radar each time the government publishes the latest statistics – so keep an eye on your inbox in the next week.
Quality of decisions assessment – warrants a special mention
The Quality measure is still fairly new, and several elements of how it functions are confusing, including the assessment period it uses (which is not the same as the speed measure’s). Lots of us are still not 100% clear on how it operates – you’re not alone. Have a look at a blog I did attempting to explain the quality measure here, and at how one council I was impressed with tries to stay ahead of things here.
Have a look at/use the Crystal Balls and let me know how you get on/what you think.
UPDATE – SEE COMMENT FOR IMPORTANT NOTE ON CAPPING
Ever since the Housing Delivery Test (HDT) was mooted in the “right homes in the right places” Housing White Paper we have been trying to get our heads round what it means in practice.
We have been nibbling away at this with pilot councils of various shapes and sizes for almost a year and we are a few weeks away from publishing our thoughts on how councils should respond to this new requirement. In advance of the “proper” versions here is what I have learned along the way.
Is the Housing Delivery Test a good thing?
Less than a fortnight ago I began working with Marc Dorfman, Special Planning Advisor at South Somerset District Council (SSDC), helping him get underneath the government’s quality designation measure. SSDC have reacted at lightning pace to improve performance; they have quickly tweaked procedures and set up a measuring and monitoring system. It’s worth sharing as a good example of how to get members and officers pulling together on what is a difficult subject. Marc said:
“The MHCLG Indicator on “Major Appeals Quality” is having an impact. The 10% “poor performer designation threshold” is very low. With many authorities receiving and deciding some 50-150 majors over two years, only 15 lost major appeals over two years would put the LPA into the designation zone.
Understanding the measure is not straightforward
South Somerset came out at 9.3% over the first designation period (2015-17), just below the designation level, when the national average was 2.5%. SSDC recognised they needed to do something. Supported by PAS with “short cut calculations” and detailed advice on how to calculate the 2016-18 and 2017-19 designation periods (there’s a 2 year assessment period and a ‘9 month time lag’ (see my last blog) to get your heads around), SSDC decided to put in place an action plan:
SSDC Action Plan
1. monthly monitoring and using the PAS Designation Quality ‘Crystal Ball’ toolkit
2. workshops with all councillors and planning committee members to look at the detail statistics, at key decisions and a review and reminder session for councillors about their own code of practice for planning
3. a new procedural step – Area Planning Committees cannot refuse major applications, they can only make a recommendation to a Strategic Committee, allowing a “time-out” for further negotiation and consideration.
SSDC are getting officers and members to work together to address the “slight increase” in major appeals allowed. “Slight”, because it only takes a few extra losses over 2 years for designation to be threatened.
Going forward, SSDC say they would like the threshold increased; the LPA is working hard to deliver its annual 5 year land supply target, (currently delivering 600 as opposed to a target of 725), but now it has to deal with the 20% buffer and the major appeals indicator. SSDC wonders why Govt is happy to punish a rural LPA to such a degree when it is managing to deliver over 80% of its annual target?
Last year the government got serious about the ‘Quality’ of decision making as part of the Designation regime. They are still mulling over whether to designate any councils, and, while the jury is out for last year, the regime continues and we’ve already reached the end of the two-year assessment period (April 2016 – March 2018) for this year’s round.
I’m regularly asked how the performance calculation works – like most things to do with designation, no one really bothers to understand it until they get in trouble, and by then it’s usually too late. So, please read this previous blog on staying ahead of designation, and read on for a better understanding of the quality performance measure.
Quality of decision making – how is my council’s performance measured?
The ‘Quality’ performance measure is the total number of appeals lost (overturned at appeal) divided by the total number of applications decided – so it’s a percentage of overall decisions, NB not lost appeals as a percentage of appeals made.
This is how it operates:
- The current period under assessment is 1st April 2016 to 31st March 2018.
- Within this period, MHCLG count up all of the applications decided during the assessment period, plus any appeals for non-determination during this period. This is one part of the sum; this is the ‘total number of applications’. MHCLG will wait until December 2018 before getting out their calculators – this allows a 9 month ‘lag’ for appeals to go through the process/system.
- The second part of the sum is the number of appeals made on those applications that were decided during the assessment period. The number of overturns relating to these appeals is then viewed as a percentage of all decisions made in the period. A mocked up example:
- Total number of decisions made in designation period: 100 (all decided applications, and non determined applications that were appealed)
- Appeals made on these decisions: 20 (appeals made and decided 9 months after 31st March 2018 – the ‘lag’ period)
- Appeals overturned: 11 (out of the 20)
- 11/100 expressed as a percentage gives a designation performance of 11%. This is above the 10% performance threshold, so you’d be in trouble.
The number of appeals made doesn’t really come into the equation – what matters is the number overturned as a percentage of the overall number of decisions.
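The calculation above can be sketched in a few lines of code, using the mocked-up numbers from the example (the function name and threshold constant are mine, for illustration):

```python
# A minimal sketch of the quality designation calculation described above.

def quality_designation_pct(total_decisions, appeals_overturned):
    # Denominator is ALL decisions made in the period (plus appealed
    # non-determinations) – NOT the number of appeals made.
    return 100 * appeals_overturned / total_decisions

THRESHOLD = 10  # the 10% quality performance threshold

pct = quality_designation_pct(total_decisions=100, appeals_overturned=11)
print(f"{pct:.0f}%")  # 11%
print("at risk of designation" if pct > THRESHOLD else "ok")  # at risk
```

Note how the 20 appeals made never appear in the sum – only the 11 overturns and the 100 decisions.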
How to manage quality performance
Because of this 9 month time lag, you need to start looking NOW at what might happen come December 2018 (when MHCLG do their sums). Here’s what to do:
- You know how many majors* you’ve decided in the designation assessment period (ref. your PS1/2 returns)
- You know how many cases you’ve refused so you should have an idea which of these refusals are likely to be appealed (and if so which ones you are confident of winning)
- You also know current appeals going through and should have a feel for whether they are likely to be overturned or not.
- By putting all of these things together you should have a feel for what your risk is come December 2018.
In many ways it’s already too late to affect performance this year, BUT IT’S NOT TOO LATE to understand where you’ll end up. If you are in trouble, government will expect you to have a good response prepared about why performance looks like it does and what measures you’ve put in place to address things – the sooner the better. MHCLG fund PAS to support councils in trouble, so get in touch with me (firstname.lastname@example.org) if you have concerns and we can assess the risk and begin helping you to get a strong case together should the worst happen. Even if you are ok this time, it’s not too late to start preparing for the 2017 – 2019 assessment period using the approach outlined above.
*The quality indicator works the same for majors and non majors – but no one gets close on non majors because of the massive numbers of decisions.
As has become standard, PAS supported a team of officials from MHCLG presenting a series of roadshows around the country throughout April and May 2018. We did 10 events, and spoke with about 450 people from 204 councils. This sort of gig provides plenty of sparky people with interesting points of view, as well as protracted periods of time sitting on trains reflecting on how it all stacks up for local government.
So, what follows are my own personal thoughts – and remember that all of this is a consultation at the moment, so it very much represents this moment in time. And, of course, the issues that tend to come out in debate and discussion are the difficult, awkward ones, which will make this list sound a bit negative. Those that know me will tell you I am a little ray of sunshine and I find negativity very difficult. Nonetheless, alongside the standard themes of “who is going to pay for all this?” and “everyone is too busy to think about and deliver this change in a sensible way” are five big thoughts:
I’ve been on a little voyage with the GDPR. Originally I argued that we needed to do a quick “heads up” on the key points for planners. There was (to be honest) a little bit of humming and aahing about whether planning was “special” enough to deserve something sector-specific, but then in the end it was agreed that we were. Just something “quick and dirty”, so off I went.
Thanks to lots of planners who asked me questions, thanks to the ICO and MHCLG, thanks to Umbreen and the ALBPO TS group, and thanks to Cheshire West & Chester we will be making something public in the next week or so that I hope will be a first step towards a practitioners guide. We can then evolve it as questions get addressed and we make joint decisions about how to behave in the grey areas.
For now, though, and in advance of the official, signed-off version, I thought I’d give you my own thoughts on all this.