I’m Kit, I work for the Department for Work and Pensions (DWP), and I recently became a lead assessor for GDS service standard assessments. Until a few months ago I was a service manager, and I’ve represented teams in around a dozen service standard assessments; this post is about what I’ve learned from sitting on the other side of the table.
I jumped at the chance to become an assessor for two reasons:
Firstly, I’m part of a cross-government digital community. I feel that I, and others in departments with experience of launching digital services, have a responsibility to put that experience to good use in that community.
Secondly (and this isn't about me specifically), it lends even more strength to the GDS assessment process that departmental digital leads are now trusted to assess services. I want to celebrate and be part of that. It’s testament to how far digital has come across government that this now happens.
What I learned
Being an assessor needs preparation too
Preparing for a service standard assessment is a real undertaking for a service team. The time leading up to an assessment is like a multi-week retrospective - you look back and scrutinise every aspect of what you’ve done, finding potential gaps and making sure your service is everything it can be to your users at a particular stage. As an assessor you’ve got 4 hours to do justice to that preparation, and to get the best out of the team in front of you.
I now understand that preparation is just as important for the assessment panel as it is for the service team, particularly around any perceived gaps and specific question points, to make sure that happens.
Show, don’t tell
Although 4 hours (often more) is a long time on paper, it really flies, and it’s the lead assessor’s job to make sure that time is productive. For me, the way to do that is to focus at least half the time on the service demo, and trust yourself and the panel to bring out the salient points from it without formal questions.
I then use the post-demo break to go through the standard points and then ask fewer, more specific questions from it to fill in the gaps in the panel’s knowledge.
Team environment says a lot
When assessing GOV.UK Notify recently, we as a panel went to see where the team sat, and looked at their working environment and wall space. How a team works is as important as what they work on, and seeing this for real made the answers come alive.
When assessing services in departments, a lead assessor can easily arrange a pre-meet or even ask for pictures of where the team sits. It’s all part of the bigger picture.
It’s as much about how you ask a question as what you ask
When representing teams at assessments, I hadn’t given much thought to how the questions were phrased. When I became an assessor I realised that the quality of the output was proportionate to the quality of the questioning, and I gave lots of time to make sure I was phrasing questions in a way to elicit clear and direct responses.
There’s training available through GDS which can help with this; I attended it recently and found it really useful.
Take care of each other
Finally, it may seem a minor point, but a service standard assessment is a big deal for everyone involved. To get the best output it’s important to keep energy levels up. There needs to be plenty of water and caffeine around, as well as some food. It’s good to build in a couple of breaks to allow people to refocus, and to keep the tone light and friendly. This quells nerves and allows people to perform at their best. It’s not an interrogation; it’s a collaborative effort.
In short, it’s been an eye-opener, and I look forward to assessing lots more services in the future. You can contact me @kitterati on Twitter with any comments – I’d love to hear from you.
Comment by Phil Buckley posted on
Hi Kit - really interesting, thanks for sharing.
I too have both assessed and been assessed, and totally agree that preparation from the panel is vital: project teams generally put in an enormous amount of work for an assessment, and a panel which has not returned that compliment risks wasting everyone’s time and potentially making bad decisions. On one panel - thankfully for a mock assessment - one of my fellow panel members hadn’t looked at the service in advance, and took up much valuable time asking questions which diverted us from going through the assessment points.
I also try to approach assessing like interviewing someone for a job: to try to give the project team the best possible chance to explain and justify their work. Deciding to fail a service can be necessary, but it is normally a very expensive decision for the project team and possibly for their users, who with a pass may get an imperfect service, but with a fail may get no service at all. I would like to be certain that we on the panel haven’t missed anything before coming to that conclusion.
Thanks again Kit -