
https://gds.blog.gov.uk/2016/11/30/building-better-assessments-for-digital-services/

Building better assessments for digital services

How we're building better assessments

We’ve blogged before about how the Standards Assurance team at GDS is working in collaboration with people from across government to improve the way we assure things.

One of the ways we assure new digital services is via the Digital Service Standard. We presented on this at the Digital Leaders meeting on 16 November 2016. I’m delighted to be able to share what we are doing with service teams to make this process better.

The Digital Service Standard is integral to how we build good digital services in government. Since its introduction in 2014, our team has assessed over 270 services, not to mention the hundreds of internal assessments we’ve helped departments to run themselves.

We’ve seen a lot of best practice across government.

Our discovery

Earlier this year, as part of our discovery, we interviewed lots of people involved in the assessment process. The overwhelming conclusion was that, while the Digital Service Standard has been fundamental to changing the way we deliver digital services, the process needs a revamp.

We’re not alone

Our findings are not unique.

In my old home of Australia, the Digital Transformation Agency is facing similar issues: how to get greater consistency across assessments, how to avoid teams spending a long time preparing for an assessment, and how to make meetings more open and constructive.

And it’s no surprise that our proposals reflect the approach the Ministry of Justice has just started trialling. The models for GDS assessments (we assess all services with over 100,000 transactions) and internal, department-run assessments (for services with fewer than 100,000 transactions) are very similar.

[Image: people watching the presentation]

The new approach

Our proposed new assurance model, which we’re testing now, is designed to meet the needs we identified during the discovery. This told us that teams need:

  • advice and support from GDS from the start
  • assessors to have a deeper understanding of the service they’re assessing
  • greater consistency in the advice and recommendations they receive

Under the new model, assessors will visit teams regularly at their location throughout the development of their service. A lead assessor will be assigned to each service and help the team to identify risks and any areas they need to focus on.

The lead assessor will bring along expert assessors – for example, on user research or service design. This will allow all assessors to gain a better understanding of the service and give immediate, less formal recommendations about progress towards meeting the Standard.

Assessors will have the option to observe team meetings, see the team’s physical boards and ask questions. They will have access to the whole team (not just the 5 members currently allowed to come to an assessment), see the team environment and the interactions with suppliers. Requests like “Give me an example of how your team works in an agile way” will become redundant.

We’re exploring ways to record the outcomes of visits that will allow service teams to track how they are progressing towards meeting each of the Service Standard points.

While the engagement will be continuous, there will still be assurance that a service meets the Standard before it’s made public. When the service approaches the end of a stage and is ready to progress, for example, from alpha to private beta, the lead assessor will bring along a ‘buddy’ to peer-review the service team. This will ensure objectivity and consistency.

Challenges

We think there are 2 big challenges facing us. They are:

  • making this model sustainable – service teams are located throughout the UK and we currently have a limited number of trained assessors
  • resolving differences of opinion – the new model requires an alternative approach to how we overcome difficult conversations and sticking points

We hope to get some ideas on how to address these from the trial.

What’s to come?

We’re in our alpha phase and are road-testing the new model with a handful of departments over the next couple of months. We still have a long way to go. We’re approaching the service assessment transformation as we would a digital service transformation. User needs will continue to be identified and validated, and we will continue testing and iterating our proposed assurance model.

This is probably a good time to say thank you to all our assessor panellists who have come along on the journey with us so far, not to mention all the service teams.

We’ll keep blogging about the process, and if you would like to be involved in any future trials, please email the Standards and Assurance Community.

Follow GDS on Twitter, and don’t forget to sign up for email alerts.


1 comment

  1. Comment by Claudio Pires Franco posted on

    Very good move! Being on the ground, visiting offices regularly, will definitely be another big push towards making more (hopefully all) projects / services truly Agile, user-centred, as digital government is now meant to be.
    In closer collaboration with departments GDS can move from being seen as the 'digital Ofsted' or the 'Spanish Inquisition' to being a digital Yoda! Or a travel companion.
    Regional GDS 'offices' / representation is also on the radar, I've heard - it could help sustainability and avoid the perception of a London-centric effort.