It’s been a few months since we last posted here about the Service Manual, the best place to find guidance and tools for government service teams. Our biggest news since then is that the Service Manual passed its beta assessment in June. Thanks to the team for all of their hard work.
Here’s an overview of what we’ve been up to since our last update.
- launched a publishing app with a new interface to allow us to update the Manual more easily
- published nearly 100 new or improved guides across 9 topics
- iterated and improved on GOV.UK’s design patterns based on our user research
- continued to do research with our users across government
- built a performance framework to check the success of the changes we’re making
Read on for a bit more detail.
Publishing app and front end
The main goal for the technology team was to build an accessible, easy-to-navigate front end, based on the GOV.UK platform.
Separating the Service Manual front-end application from the standard GOV.UK front end lets us develop new features more quickly - useful for rapid prototyping and testing. These include an accordion (an expandable list) on the topic pages and the page history on guides.
Another goal was to launch a publishing app so that content designers could publish Service Manual guides using a secure and logical workflow. The minimum viable product (MVP) version of the app is now live and we’re already working on the next version. We’re also exploring how we can offer what we’ve built to other government publishers on GOV.UK.
New content we’ve published
We’ve been working with experts across government to update and reorganise the information in the Service Manual. This is so it reflects current best practice and draws on the growing digital skills and experience of people working on services across government.
So far, we’ve published fresh guides in these topic areas:
- agile delivery
- the team
- measuring success
- user research
- service assessments
- Digital Service Standard
- helping people to use your service
We’ve already started iterating and improving newly published content based on user feedback. We will continue to do this.
Design patterns and our design
As well as remaining consistent with existing GOV.UK design patterns, we’ve also designed new patterns. These include the accordions, page contents and page metadata mentioned above. They’re based on research with Service Manual users. They’ve been designed and built so they can be reused across the rest of GOV.UK where appropriate.
We’ve also designed an MVP interface and flow for the back-end publishing app. We’ll continue to improve this based on user research with content designers.
We’ve focused on 3 broad themes in our user research:
- exploring the relationship between the Manual, the Digital Service Standard and assessments
- defining what the Manual should cover
- validating our approach to content, which we do routinely, and especially before launching a new topic
Earlier this year we experimented with publishing the Digital Service Standard as a stand-alone publication on GOV.UK – we tried separating it from the guidance and service assessments information in the Service Manual. But when we tested this in the lab, we found it didn’t make sense to our users. So for now the Digital Service Standard remains a part of the Service Manual.
Through content evaluations, we continuously test how our approach to guidance works for users, to make sure it meets the needs we identified in our discovery.
We focus on:
- whether users understand what’s mandatory and what’s optional
- whether users find the Manual persuasive and useful
We’ve started exploring if the Service Manual’s scope should change. We’re considering how the Manual relates to all the GOV.UK resources (products, tools, patterns and components) that help teams design and build a service, and how service teams use these.
Based on earlier research, we’re looking at what the Service Manual should include and what it should link to. We’ll start testing with users soon.
We’ve built a performance framework to measure the effect of the changes we’re making.
The Manual is non-transactional. As it has no defined start points or end goals, it can be hard to measure how well it’s doing.
We’ve addressed this issue by choosing measures and key performance indicators that are based around user behaviour. For example, we’re using Google Analytics to check if users are:
- leaving the site on navigation pages
- finding what they need in search
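As a rough illustration of the kind of behaviour-based measure this framework uses, an exit-rate indicator for navigation pages could be computed from exported page-level analytics data. This is a hypothetical sketch, not the team’s actual implementation - the field names, page paths and sample figures are all assumptions for the example.

```python
# Hypothetical sketch: computing an exit-rate KPI (exits / pageviews)
# from exported page-level analytics data. Field names, paths and
# numbers are illustrative assumptions, not real Service Manual data.

def exit_rate(rows, pages):
    """Combined exit rate for the given set of page paths."""
    picked = [r for r in rows if r["page"] in pages]
    views = sum(r["pageviews"] for r in picked)
    exits = sum(r["exits"] for r in picked)
    return exits / views if views else 0.0

# Example export: topic (navigation) pages and a guide page
sample = [
    {"page": "/service-manual/agile-delivery", "pageviews": 1000, "exits": 450},
    {"page": "/service-manual/user-research", "pageviews": 800, "exits": 200},
    {"page": "/service-manual/agile-delivery/core-principles", "pageviews": 600, "exits": 90},
]

nav_pages = {"/service-manual/agile-delivery", "/service-manual/user-research"}
print(f"Navigation-page exit rate: {exit_rate(sample, nav_pages):.1%}")
# prints "Navigation-page exit rate: 36.1%"
```

A high exit rate on navigation pages (relative to guide pages) would suggest users are leaving before reaching the guidance they came for, which is the signal the measure is designed to surface.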
We’ve also redesigned our online user survey, which sits at the top of every page. We designed this to work in conjunction with the analytics we gather and to improve the quality of information we get from users.
We’re using this combination of analytics and user feedback to inform changes in a number of areas, from how we display videos on the site to improving the user journey for on-page links.
Talk to us
We passed our beta assessment, but we’re not done yet.
In the next few months we plan to publish more content in technology, user research and design – including service, content and interaction design. We’ll keep improving the content we’ve already published. We’ll also keep exploring the relationship between the Service Manual and the other tools our users need.
We’ll update you soon on what we’ll be doing next. In the meantime, let us know about your experience using the Service Manual.
Get in touch with us through the cross-government or GDS Slack channels, or leave a comment below.