Hi, I’m Angela. I work in the User Research Team and I’d like to talk about the range of research we conducted in the run-up to the launch of Inside Government.
Just to recap, on 15 November the GDS team successfully launched Inside Government, with DCLG and DfT. Those departments have since been joined by MOD, FCO, BIS and the AGO. The remaining 18 ministerial departments will be fully migrated over to the GOV.UK domain in early 2013, with over 300 other public bodies and organisations joining the site by March 2014.
But let’s go back a few months...
Planning any programme of research starts with understanding the target audience and their user needs. We already had an idea about who the users were. Our insight was based on site analytics, industry data and our own surveys and interviews. Our 'holistic' research plan included quick, focussed feature testing, more formalised structured task-based evaluations and wide-scale quantitative insight.
The early days and the ‘agile approach’
To date, our team have reviewed both the beta and live versions of Inside Government, involving over 500 external users through a combination of research methods.
Back in the spring, Nick Breeze conducted some task-based user testing on an early version of Inside Government. By the time I arrived in the summer, the designs had evolved. In October, I conducted a quick expert review on the latest designs and followed this up with some guerrilla testing with real end-users from Southend-on-Sea Borough Council.
Mark Hurrell, a designer from the Inside Government team, attended those sessions. He could see and hear first hand how people were interacting with the site, and then weave the findings back into the designs shortly after. Mark talked about this in his Tumblr post on 16 October.
This guerrilla approach to user testing works well. Sessions are informal, focussed on specific features and, with development teams so closely involved, there's very little need to report back.
Qualitative
We don't rely solely on any one method of user testing. In November we conducted formal lab-based user testing in London and Leeds. We did this to get a better understanding of how our audience was interacting with the site and to flag up any glaringly obvious issues that needed addressing before launch.
Individual facilitated sessions were conducted with 12 participants, all of whom regularly used the DfT and DCLG websites. Each session lasted an hour and covered specific tasks around: proposition, policy pages, organisation homepages, microcopy, navigation and layout.
In addition to the 12 lab-based sessions, a further 2 participants with visual and cognitive impairments reviewed the site with a facilitator. They used their own ‘assistive technology’. Another 9 participants accessed the site remotely and evaluated the site, without a facilitator, using a questionnaire.
Some of the most heartening feedback the team received was from the research agency conducting the sessions, who said that in their professional experience this was ‘the most positive user testing of any website they had ever seen’.
Overall the findings revealed that:
- the proposition of Inside Government was positively received
- there was a good understanding of who the site was aimed at
- the clean, uncluttered layout was appealing
- for most the global navigation supported a seamless journey across departmental content
However, the research also highlighted areas for improvement, in particular:
- the type of content showcased on the department homepages (users wanted less 'PR')
- the ability to easily locate consultation documents
- the visibility of the links to departments' homepages
Neil Williams, Mark Hurrell and the Inside Government team have already made some changes to address all 3 of these issues. Improvements will continue in the New Year.
Quantitative
Once the site launched, the Inside Government team were able to assess levels of traffic, demand and engagement on the site.
We also conducted some remote usability testing, or what we call 'summative testing'. This allows us to gauge the performance of content, layout and perception with a large representative sample of internet users.
We selected 5 tasks broadly similar to those used in the lab testing. We measured task success, time to completion, click streams and satisfaction. We collected the search terms used and recorded all comments made.
The key findings revealed that:
- 65% successfully found the answer they were looking for, which means 35% did not. That's a significant learning and one we're addressing
- the average time taken to complete each task was 2.1 minutes. This is a useful benchmark to measure future tasks against
Next steps
Inside Government is set to grow, as will our knowledge about our audience and how they use the site.
After the Christmas break we'll continue to do more guerrilla-style testing, working closely with development teams and departments. We'll be conducting more formal, lab-based testing at the end of January and we've planned some summative testing for the spring. Watch this space.
2 comments
Comment by Jon W
> users wanted less ‘PR’
I wholeheartedly endorse this. I get fed up of ploughing through news releases and loads of items giving ministers' dubious spin about policy priorities when what I want is the actual guidance, statistics and facts I need to put to use. These usually appear well below the fold. Without a consistent topic-based menu system it can be really hard to find what you want, even when you know it should be there. I am thinking especially of the DCLG content. Indeed, on the day the Local Government Finance Settlement figures were launched the only content on the gov.uk website was "PR" - there wasn't even a link (that I could find) to the actual figures, which were still hosted on the old site.
I think the ambition is admirable but there is still a long way to go on usability.
Comment by Paul Marr
Research must be done inside the government. Good article