Agile testing at the Home Office

Categories: Transformation

More and more teams across government are getting involved in the transformation programme. We asked Mat Costick to write about his experiences working on an Agile project.

I'm the portfolio test manager for several projects at the Home Office. We’ve recently started to do our testing using Agile techniques; a big change for the team, but one that’s gone very well so far.

A bit of background

TDCS is a shared service within the Home Office; it stands for Test, Design and Consultancy Services. We support projects in the Home Office and other government departments like the Ministry of Justice and DEFRA.

We've always used waterfall as our test methodology for our major programmes, with TDCS being responsible for user acceptance tests, system integration tests and assurance tests. As part of the assurance role we'd get involved in writing contract requirements for testing and then assure suppliers' capability to do that testing. Later we'd review what suppliers were doing: test plans, coverage, scripts, data, that type of thing. In effect, we were policing the quality of what suppliers had said they would do.

Typically, there'd be a huge gap between the work that we'd do up front and the actual acceptance testing of the service, and there are a few drawbacks that go with that: things like requirements changing by the time we get to testing, or managing large numbers of change requests. In turn, that usually means spending lots of money, lots of extra testing, and more regression testing for the supplier, who has to go back through the waterfall model again.

We do it pretty well, but it's a laborious and long-winded way of getting there, and by the time we get there the business's needs have usually moved on.

The visit visa application is currently in beta. We completed the alpha back in July, which was the first time we’d done Agile testing for the Home Office on a major programme. It's been quite a big cultural change. It’s been worth it though.

Kim has written about some of the specifics, but two things stick out for me.

Test with your users

The amount and quality of testing with users was revolutionary from our perspective; we'd never seen it done to that level before, with user research and user experience central to everything we did. Chris and Katy shared a bit of their thinking about that in a blog post recently.

We were lucky; we had very good people both in the supplier and GDS teams. The way they did it and all the feedback that came out of it was superb. That's what we're trying to write down now to share with other projects; these are the methods, this is what you need to consider, these are the resources you should think about and put into your requirements for suppliers to provide.

Automate as much as possible

The other key lesson from a test perspective was the role of automation: automate as much as possible.

We need a different technical skill set in addition to the Test Analyst role, which carries out user acceptance, integration and exploratory testing. To get real benefit, we want to automate everything we can. We didn't have those skills in the alpha but we do now in the beta, and it has made a marked difference already.
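To illustrate the shift, here is a minimal sketch of what replacing a manual test script with an automated check can look like. The `validate_application` function and its rules are entirely hypothetical (the post doesn't describe the actual service logic); the point is that each check, once written as code, runs in seconds on every build instead of being stepped through by hand.

```python
# Hypothetical business rules, standing in for a step from a manual
# acceptance test script. The field names and rules are illustrative only.
def validate_application(application):
    """Return a list of validation errors for a visa application form."""
    errors = []
    if not application.get("full_name"):
        errors.append("full name is required")
    if len(application.get("passport_number", "")) != 9:
        errors.append("passport number must be 9 characters")
    return errors

# Automated checks: these run unattended on every build, where a manual
# tester would previously have re-keyed the form for each release.
def test_complete_application_passes():
    app = {"full_name": "Jane Doe", "passport_number": "123456789"}
    assert validate_application(app) == []

def test_missing_name_is_rejected():
    app = {"full_name": "", "passport_number": "123456789"}
    assert "full name is required" in validate_application(app)
```

A test runner such as pytest would pick these functions up automatically, so the suite grows alongside the service rather than in a separate script library.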

Agile doesn’t solve every problem

There are still hurdles that we're trying to get through. Testing with legacy systems can be very difficult. We’ve got no problems accessing front end services to test them, but it’s more complex with something like the biometric matching systems that are on dedicated restricted networks.

We need to book slots in advance - multiple Home Office projects might be using some of them - and they don't lend themselves to running in a virtual environment. That means you can't work as fast as you'd like, and it's hard to get those suppliers to deliver fixes in step with you. Ideally, you want both teams to be Agile; instead you effectively end up running semi-iterative waterfall methods.

We need that balance of being able to continuously integrate within what we're delivering while recognising that to do the full end-to-end testing against the old systems we need to build that into our estimates a little bit more.
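One practical way to strike that balance is to tag the tests that need the legacy systems so they are skipped in the continuous build and only run during a booked slot. The sketch below uses Python's `unittest` and an environment flag; the names and the flag are assumptions for illustration, not the team's actual setup.

```python
import os
import unittest

# Assumed convention: set RUN_LEGACY_E2E=1 only when a slot on the
# restricted network has been booked for end-to-end testing.
RUN_LEGACY = os.environ.get("RUN_LEGACY_E2E") == "1"

class FrontEndTests(unittest.TestCase):
    """Fast checks against the front-end service: run on every commit."""
    def test_front_end_journey(self):
        self.assertTrue(True)  # placeholder for a real front-end check

@unittest.skipUnless(RUN_LEGACY, "needs a booked slot on the restricted network")
class LegacyEndToEndTests(unittest.TestCase):
    """Full end-to-end checks against legacy systems: run only when booked."""
    def test_biometric_matching(self):
        pass  # placeholder for a real end-to-end check
```

With the flag unset, the continuous build still gets fast feedback from the front-end suite, while the legacy suite is reported as skipped rather than silently forgotten, which also makes the extra end-to-end effort visible when estimating.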

That legacy state will, obviously, change over the next two or three years, but at the moment we're not there.

There’s much more to come

At first, it really was hard to let go of traditional methods of, "I need a full design spec; I need a full test plan." But once the team took that leap of faith they wouldn't go back. The biggest problem for us is changing that mind-set to work differently; it really is difficult. But once they do it, it's a much better, rewarding way of working for them.

It's not the answer to everything, but there are huge gains in failing fast, changing quickly, and delivering working solutions early that can be built upon using constant business feedback. It's been a huge step for us in terms of getting engaged in Agile teams, but it's like a train now; it's really difficult to stop and we don't want to get off!

Sharing and comments

  1. Comment by Brett Brown posted on

    Hi Mat. I'm working on the Single Tier Programme, which is being delivered using Agile methodologies. Could I ask if you have produced any test approach / Agile strategies that may help us inform ST test plans?

    • Replies to Brett Brown

      Comment by Mark Winspear posted on

      Hi Brett - I'm leading the testing at the Skills Funding Agency and produced an Agile Test Framework in place of the existing 100-page Test Strategy. I realise this post is a little old now; however, I would be interested to see how you solved this and whether we could learn anything from each other's approach.

  2. Comment by Laurie Monk posted on

    It's a bit light on information. Do you have more detail on the approach, the tools, etc.?