I was hosting one of our events yesterday in Birmingham. As we got closer to the end of the day, and people started shuffling and thinking about their trains, I politely but firmly reminded everyone that they were not allowed to escape without submitting their feedback form to me.
Feedback forms surround us. Sandwiches, hotel rooms, taxi rides – I probably have the opportunity to comment on a service or product almost every time I use one. On reflection, rather than threatening to imprison people, it seems to me that people would more gladly give us their opinion if they had confidence in our ability to notice what they said. To continue our series of “behind the scenes”, this is the story of what happens to your feedback forms.
There are, of course, different types of evaluation. As soon as I’d fought the good burghers of Birmingham for a seat on the train home I did the first scan of the forms. This first cut delivers two things – an initial feel for satisfaction and a chance to read the delegates’ comments. It is this run through that dictates whether I drown my sorrows or toast my victory with something from the buffet cart.
However, we often run events in series, using a number of different consultants. We need a way of bringing all the feedback together that allows us to learn from what happened – both the general lessons from working for our sector and the specific elements in case we want to repeat the series or run something similar in the future. This is what we call the “wash up”. We ran a series of LDF workshops at the beginning of the year – this is what the washing up looked like when all the plates had been cleared away.
You may need to fullscreen the presentation above to be able to read it. What follows is the same kind of stream of consciousness that you’d hear if we were in the same room. Hopefully it’s broadly clear which slide I’m talking about.
In case you don’t know PAS, we offer almost all of our services free – as in “free at the point of delivery”. The nominal cost of £188/head excludes staff costs, and given that most of our speakers also offer their services free it is a serious under-estimate of the true figure. The response rate of 329/453 is actually pretty good in comparison to our peers. There are, it appears, some people who will not under any circumstances offer an opinion.
The next pair of slides are done for corporate requirements. We never, ever, argue about anything within the PAS team. This use of statistics is the nearest we have to a schism. A short aside:
We ask people for a mark from 1 to 4 for each section of the day. It’s 1 to 4 because if it were 1 to 5 everyone would pick ‘3’. We have corporate standards that require our feedback to hit a certain percentage of “3s and 4s”, and it is quite high. A quick conditional format later and we have the RAG table you can see above. On the face of it, it looks like great data – visual, quick, relevant. It enrages me. What is actually happening is that we take all the interest out of the user response – we treat 1&2 and 3&4 as the only two kinds of answer and lose all the benefit of the outliers. We all know that the reason we do it is to make ourselves look good, and we do. This is the data that is usually used to report our activities externally. I think the most charitable thing I can say is that it is at least consistently rubbish. Something did go slightly wobbly in London. My intro at Winchester (my first ever presentation for PAS) did not disguise my terror well enough.
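If you want to see exactly what the collapse throws away, here is a minimal sketch in Python. The scores and the pass-rate threshold are invented for illustration – our actual targets live in a corporate spreadsheet, not here:

```python
from collections import Counter

# Hypothetical delegate scores (1-4) for one session -- invented data.
scores = [4, 4, 3, 3, 3, 2, 1, 4, 3, 1]

# The corporate measure: what share scored "3 or 4", against an assumed
# 80%/60% RAG threshold (the real target figures are not shown here).
share_3_4 = sum(1 for s in scores if s >= 3) / len(scores)
rag = "green" if share_3_4 >= 0.8 else "amber" if share_3_4 >= 0.6 else "red"

# What the collapse discards: the full distribution and the average,
# including the 1s and 4s at either end.
distribution = dict(sorted(Counter(scores).items()))
average = sum(scores) / len(scores)

print(f"{share_3_4:.0%} scored 3 or 4 -> {rag}")
print(f"distribution: {distribution}, average: {average:.1f}")
```

The point is visible in the output: a session full of 4s and a session of uniform 3s can produce the same RAG colour, while the distribution and average tell them apart.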
We quickly move into more interesting things. Unsurprisingly for an LDF seminar, the majority of the people there were planners from the Local Planning Authority (LPA). This is useful feedback of the most basic but important sort – are the correct people getting to hear about our events, and are they given enough freedom to attend?
The gender and ethnicity on the next slide represents a work in progress for us. I suspect it does for many of the organisations that collect it. We can tell that the bulk of our audiences are white British, and that women begin by being equally represented but fall off sharply at 40. It is a minor mystery what happens to them after that – the difference can’t solely be due to a differential in retirement ages. This section, probably more than any other, requires a bit of “so what?”, and an understanding of the make-up of our sector.
The map is our way of looking at our coverage (thanks to Zoe in LGIH for her ongoing enthusiasm and assistance). It is a planning tool – like anyone trying to achieve coverage over England, we need to shuffle our venues around to try for the best fit. This particular map excludes the event we held in Exeter, so it isn’t quite as barren in the South West as is suggested. This and the next slide – demand for particular venues – are our way of planning regional events. Depressingly for an organisation that goes out of its way to avoid London-centrism, it does seem to be the case that London is actually quite straightforward for people to get to.
Slide 11 is where it gets really useful. We average (note – not count 3s and 4s) delegate feedback and slice it by a simplified role category. This is how we can get to grips with content. From this session we learnt:
- the small number of private sector delegates were happiest of all
- regional bodies (often but not exclusively reps from regional assemblies) were impressed
- county planners probably needed some content specifically to address their needs
- government agencies (highways, environment etc) were underwhelmed
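The slicing itself is nothing exotic – group by role, then average, keeping the full 1-to-4 range. A minimal sketch, with role labels and scores invented purely for illustration:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical (role, score) pairs from the feedback forms -- invented data.
responses = [
    ("private sector", 4), ("private sector", 4),
    ("LPA planner", 3), ("LPA planner", 4), ("LPA planner", 3),
    ("county planner", 2), ("county planner", 3),
    ("government agency", 2), ("government agency", 2),
]

# Slice by simplified role category...
by_role = defaultdict(list)
for role, score in responses:
    by_role[role].append(score)

# ...and average each group, rather than counting "3s and 4s".
averages = {role: mean(scores) for role, scores in by_role.items()}

for role, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{role}: {avg:.2f}")
```

With made-up numbers like these, the averages reproduce the pattern above: private sector happiest, government agencies underwhelmed.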
Written comments are the most difficult kind of stuff to process. In terms of a summary, all we attempt to do is to pick up the recurring themes. For this series, it was the opportunity to catch up with old acquaintances and the information on strategic housing land availability assessments (SHLAA).
The penultimate slide is a representation of the feedback for each session across all 10 events. The first session is the “intro”, and the final one is the “Q&A panel”. It’s clear that the number of marks reduces over the course of the day as people have to leave. The relative popularity of each section shouldn’t be over-interpreted – some of these sessions are done because they are good for you, not popular – but there are some reasonably clear messages:
- The most popular session is probably POSe, closely followed by the ever-popular PINS
- The “most excellent” session was SHLAA
- The Q&A sessions are difficult to get right.
- The session on PPS12 was, by and large, not well received
The final slide is some commentary from the event lead, mainly about improvements to our booking and admin processes.
So, now you know. We do look at the forms, and we do think about how we can learn from our mistakes. This is not the whole process, of course. In the new year there is a “refresh” planned of our presentation materials. Some of us are hoping we can push this process beyond a branding and template exercise and into some of the areas described by Garr Reynolds in his mostly great book. Our next team-building exercise? I vote for condensing our day-long seminar, TED-style, into under 20 minutes.