Using A/B testing to make things better

Posted by: , Posted on: - Categories: GOV.UK

Our user support analysts have used A/B testing to help 1,000 more users a month find the contact details they're looking for on GOV.UK.

Find out how ...



My name's Ricky Morris. I've been looking at how we can improve our processes around feedback on GOV.UK.

We noticed that when people were getting in touch with us, they were asking for contact information that was already available on other pages. When people come to GOV.UK they may not know the exact name of the department they want to get in touch with, so we had an idea about putting some of that contact information up so that people could get there more easily and quickly.

We ran an A/B test where we compared a new design to the old design. The old contact page had 5 links to the most popular contact pages. We added 20 more, and what we discovered was that with the new design, of the 50,000 people a week who come to the contact page, we were helping nearly a thousand of them get straight to the contact information they needed.

Running A/B tests is a great way of getting very robust data, with very measurable evidence of how we're making an impact.
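The "robust data" part of an A/B test comes from checking that the difference between the two designs is bigger than chance would explain. As a minimal sketch of that check, here is a two-proportion z-test in Python. The visitor and click numbers below are purely illustrative, not the real GOV.UK figures:

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: is variant B's success rate different from A's?"""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis (no real difference)
    p = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical split: 25,000 visitors saw each design; 1,200 (old, 5 links)
# vs 2,150 (new, 25 links) clicked straight through to a contact page.
z, p = two_proportion_z(1200, 25_000, 2150, 25_000)
print(f"z = {z:.2f}, p = {p:.4g}")
```

A very small p-value (conventionally below 0.05) means the uplift is unlikely to be random noise, which is what lets you report the result as measurable evidence rather than a hunch.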

Sharing and comments


  1. Comment by Ricky posted on

    Hi Rohan,

    thanks for the thoughtful comment. I wholeheartedly agree that it would be great to be able to provide support to our colleagues across government to help them optimise their designs and processes. When we looked into it, we felt there was not enough capacity for GDS to roll out full-scale support for controlled experiments.
    There is quite a high barrier to being able to perform MVT tests successfully, especially in terms of avoiding making incorrect conclusions based on misunderstandings of the data, or experiment design errors. Therefore we haven't made strong recommendations in the design manual, although we are working towards adding guidance to encourage testing.
    However, we have included MVT functionality in the open source frontend toolkit, which will help departments wishing to get started on their own journey with running experiments (if they are using Google Analytics):

  2. Comment by Sam posted on

    Oh how I wish you could split-test print advertising!

    • Replies to Sam

      Comment by Rohan posted on

      I found this a really interesting case study.

      I wonder if the current GDS design manual doesn't do enough to promote A/B and multivariate testing? There is very little readily apparent material - one link to a Wired article, one link to Wikipedia and one video which doesn't go into much detail on A/B testing. It would be nice to see text in the main sections of the document along with additional links (e.g. ).

      Our current experience of user centred design in Government is that A/B testing and robust analytical approaches are adopted infrequently and are not a priority for digital leaders under pressure to deliver. This seems a shame. In large transactional areas A/B testing can answer questions on fraud and error and non-digital channel contact costs, as well as handle conversion/sign up/response issues. Our experience running large successful multivariate tests (e.g. on outbound emails) in a department is that those outside the core 'digital' (and, I'd argue, GDS manual influenced) sphere are currently more receptive to using these types of approaches. This is slightly counter-intuitive, so I wonder if members of GDS have come across these issues? If not changes to the manual, I'd be interested in GDS views on building support for A/B testing amongst digital leaders in Government?