A few weeks ago GDS hosted the latest training session for Digital by Default (DbD) assessors. There were over 50 attendees from across government - the largest assessor training session we have run so far.
This training is designed to explain why the Service Standard is important to digital services, and share the experience of assessing services against it.
The Service Standard team started training people in January. The first attendees were nominated by Digital Leaders. Some had already run assessments and sent through reports to certify that services (those with fewer than 100,000 transactions a year) are Digital by Default.
The training lasted about 3 hours with a mixture of presentations, a Q&A panel made up of GDS assessors, and mini-assessment exercises.
The session opened with a welcome presentation from Richard Sargeant, Director of the Performance and Delivery Unit at GDS (the area the Service Standard team reports into). Richard talked about the importance of changing the approach to digital in government, and the need to find a new way of delivering services. He explained that the DbD standard is not just part of the Digital Strategy, but a way to guide departments and agencies to build digital services that people prefer to use.
I talked through examples of standards outside government, such as the Apple App Store guidance, and consistent but localised Google branding around the world.
On the walls around the room were reports written by GDS assessors without the result written on them. During the break, everyone had an opportunity to read them and decide if the service had been successful or not at an assessment. The trainees were surprised by the directness of the reports, and the breadth of the services government offers.
It wasn’t possible to answer every question as time was short, but it was great to share ideas and knowledge. This sort of training helps give new assessors a feel for the process at GDS, and examples of the types of assessments happening across departments.
A lot of questions were about the depth of questioning in assessments, and the panel agreed that while it does depend on the service, most like to cover all areas of the standard and drill down into areas where they have concerns or see potential weakness.
One questioner asked: “What makes a really good service?” The panel all agreed that a focus on the users, coupled with an approach that welcomes change, will always create a strong foundation.
The mini-assessment exercise gave the assessors the opportunity to practise questioning each other and develop the skills they need to run an internal assessment of services.
While the scenarios were not government-related (Lego and Doctor Who), the groups had to question each other as they would in an assessment, taking notes and focusing on areas of the DbD standard such as User Needs or Security and Privacy. Everyone took part, and took the opportunity to meet new people from across government who will be sharing the role.
The feedback so far has been positive, with lots of comments on the open style of the training and the understanding that this is new across government:
“I particularly liked the openness, and acknowledgement that we are all learning and improving as we go through this.”
“I appreciated contact with GDS's learning culture.”
At the end of the session we asked everyone what they would find useful to support them in their role as assessors. So we have agreed to bring together all the assessors we have trained so far for a conference at the end of the year, to learn from each other's experiences.
I’ve been following up after all the training sessions using SurveyMonkey to ask some straightforward questions and get some more detailed feedback. The good news is that everyone who fed back is confident they are now equipped to run, or be part of, an assessment of services.
Some of the suggested changes included giving more time to discussing what a good service looks like, and more time for the training exercise (with more help on the scenarios). The clear favourite section of the training was the Q&A session with a panel of GDS assessors.
Here are the slides from the day.