As mentioned in other blogs, one of the first and primary tasks for the Alphagov project was to build a site that provided the information and services users need. So where were we going for the evidence? Within the timescale and resources of the project we saw value in taking a quantitative approach - looking at analytics and particularly search analytics across the central government web estate.
Writing this, I’m aware of Leisa Reichelt’s blog on UX and Alphagov, but here I’m going to talk about what we learned from analytics about what people are doing. Of course, this should be informed, qualified and enriched by qualitative studies to understand intent.
How do visitors get to central government information?
We looked at where users were coming from when arriving at central government websites. Hitwise provides information on which referring (upstream) properties send traffic to a site. Unfortunately, it only reports on the top twenty referrers. Grouping these together, we see that search is the main driver:
**Upstream websites visited before Government – Jan 2011**

| | All govt sites | Central govt sites |
| --- | --- | --- |
| Search (total within top 20) | 62.40% | 62.50% |

Others in the top 20 include the BBC and Wikipedia.
**Upstream websites visited before 2 major departments – Jan 2011**

| | Department 1 | Department 2 |
| --- | --- | --- |
| Search (total within top 20) | 73.79% | 71.69% |

Others in the top 20 include the BBC and Wikipedia.
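The kind of grouping described above can be sketched in a few lines. This is a minimal, illustrative sketch only: the referrer names, category labels and percentages below are placeholder assumptions, not the actual Hitwise figures.

```python
# Group top referrers into broad categories and sum their traffic share.
# The referrer list and percentages are illustrative placeholders,
# not the real Hitwise data.
from collections import defaultdict

top_referrers = [
    ("Google UK", "search", 40.1),
    ("Google", "search", 12.3),
    ("Bing", "search", 5.2),
    ("Yahoo! Search", "search", 3.1),
    ("Facebook", "social", 2.4),
    ("BBC", "other", 1.8),
    ("Wikipedia", "other", 1.5),
]

def share_by_category(referrers):
    """Sum traffic share per category across the top referrers."""
    totals = defaultdict(float)
    for name, category, share in referrers:
        totals[category] += share
    return dict(totals)

print(share_by_category(top_referrers))
```

Even with only the top twenty referrers reported, a rollup like this makes the dominance of search immediately visible.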
We also had access to a variety of site metrics, and here are some figures for Directgov:
**Upstream websites visited before Directgov – Jan 2011**

| Source | Share |
| --- | --- |
| Affiliate (govt sites & other partners) | 13.60% |
| Organic referrer (other links) | 18.90% |
So, of course it's vital to provide other routes to information, but the evidence points to the importance of Google, Bing and social websites as the starting point of people's access to government information. It's therefore useful to review what people are searching for.
Users' information needs expressed through search
To identify candidate user needs for Alphagov we reviewed data from a variety of sources:
- Referring terms from web search engines to central government websites
- Referring terms from web search engines to Directgov, businesslink.gov.uk and a number of Government Departments
- Terms entered into site search on Directgov, businesslink.gov.uk and a number of Government Departments
I'm a fan of Louis Rosenfeld's approach to using search analytics, and the attraction for me is that search logs provide a direct expression of many, many users' information needs. The data are usually readily available, and the main cost is interpreting them.
Firstly, some characteristics of search data:
- The average length of a search phrase is around three words. Users are trying to express a wealth of complex needs and intent in a short phrase.
- Search results show a classic Zipf curve, with a short head (relatively few terms searched for a lot); and a long tail (very many terms searched for by only one or two users).
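The head/torso/tail split above can be made concrete with a small sketch. The cut-offs (top-N for the head, a count threshold for the tail) and the toy query counts are illustrative assumptions, not values from our logs.

```python
# Split search phrases into head / torso / tail.
# head_size and tail_max_count are illustrative assumptions: the head is
# the top N phrases by volume, the tail is anything searched for only
# once or twice, and the torso is everything in between.
def split_head_torso_tail(query_counts, head_size=20, tail_max_count=2):
    """query_counts: dict mapping search phrase -> number of searches."""
    ranked = sorted(query_counts.items(), key=lambda kv: (-kv[1], kv[0]))
    head = [phrase for phrase, count in ranked[:head_size]]
    torso = [phrase for phrase, count in ranked[head_size:]
             if count > tail_max_count]
    tail = [phrase for phrase, count in ranked[head_size:]
            if count <= tail_max_count]
    return head, torso, tail

# Toy example (made-up counts):
counts = {
    "council tax": 100,
    "jobs": 80,
    "council tax bands": 10,
    "pay council tax online": 2,
    "odd one-off phrase": 1,
}
head, torso, tail = split_head_torso_tail(counts, head_size=2)
```

Ranking by descending count (with the phrase as a tie-breaker) keeps the split deterministic when counts are equal.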
Just looking at the short head tells you the most important search tasks, but only at a simplistic level. Much richer data starts to appear in the 'middle torso', where you start to see not only new phrases but added nuances or facets of the top phrases, which can give a clue to what the intent expressed in three short words might really mean. For example, Council Tax is a popular search term, but digging deeper into the long tail, a number of facets become apparent:
- council tax bands/council tax rates
- council tax exemptions
- how much is council tax?
- pay council tax/ pay council tax online
- council tax benefit
- council house discounts
- council tax moving house
- registering for council tax
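A crude way to surface such facets is to group phrases by distinguishing keywords. As a minimal sketch, assuming a hand-built keyword-to-facet mapping (the facet names, keywords and phrases below are illustrative, not our actual taxonomy):

```python
# Group middle-torso search phrases under a parent concept into facets,
# using a hand-built keyword map. The mapping and the sample phrases
# are illustrative assumptions.
FACET_KEYWORDS = {
    "bands/rates": ("band", "rate", "how much"),
    "payment": ("pay",),
    "exemptions/discounts": ("exempt", "discount", "benefit"),
    "moving/registering": ("moving", "register"),
}

def facet_for(phrase):
    """Return the first facet whose keywords match the phrase."""
    p = phrase.lower()
    for facet, keywords in FACET_KEYWORDS.items():
        if any(k in p for k in keywords):
            return facet
    return "other"

phrases = [
    "council tax bands", "pay council tax online",
    "council tax exemptions", "council tax moving house",
    "how much is council tax",
]
groups = {}
for ph in phrases:
    groups.setdefault(facet_for(ph), []).append(ph)
```

In practice this kind of grouping needs human review, since short phrases are ambiguous, but it helps turn thousands of torso variants into a shortlist of candidate answers.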
These are just the main concepts that need distinct answers; there are many more variations of phrasing. So our approach was to go deeper into the middle torso and build up a series of top concepts, along with the important variations that probably need a separate answer. Going back to Council Tax, here are some of the landing pages on Alphagov:
- How much are Council tax bands?
- Apply for a Council Tax band reduction
- Pay Council Tax
- Check your Council Tax balance
This data was then reviewed against top landing page data from Directgov, businesslink.gov.uk, the Local Directgov service and a number of departmental sites, and against business priorities gathered by talking to stakeholders, to establish a list of top 'tasks'.
Now, tasks could be a range of things:
- Completing a transaction (renew car tax)
- Using an online tool or look-up (calculate holiday entitlement)
- Finding a quick answer (when is the next bank holiday?)
- Finding out some information - to orientate, or to find out more (child tax credit). This could be provided by a guide or even a disambiguation page.
And people's intent changes according to where they are on their search journey. As people progress along the journey their search terms tend to be more specific and more focused:
| Task | Type of search term | Example |
| --- | --- | --- |
| Compare resources | Narrower | Diabetes symptoms |
| Compare options | Narrower | Managing diabetes |
| Locational | With a location | Pharmacists in SE1 |
So, from our perspective, this data gave a strong steer to the approach Richard Pope outlined in his earlier blog, It’s all about the nodes and what lives at them.
It's the landing page or node which is important, whether people are coming from web search, site search, deep links or social media. Increasingly, in future, people will be consuming content and tools elsewhere, through syndication and APIs. Where they land needs to make sense as a granular, stand-alone item, with a strong 'scent of information' from the link to the answer.
Signposting to related information and services is of course very important. People may need to know more, or it may be to their benefit to be pointed to additional information. I hope future iterations will experiment with leveraging metadata and concepts of relatedness to enhance this experience.
Many thanks to colleagues for contributing to this analytics work, especially Helen Lippell, who provided valuable insight based on user-centred taxonomy development.