What we learned from the National Library's first annual online satisfaction survey

Dear reader, there's quite a bit of detail here about the process of designing the survey that user experience practitioners might find useful. However, if you just want to get to the results feel free to skip to the heading, 'What we learned'.

Why we did a satisfaction survey

The Library aims to be customer-centred in its approach to the design and ongoing improvement of our online services. The primary tools that we use are usability testing (usually in one-on-one observed sessions), listening to our customers (either in person or via feedback forms), and analysis of web usage metrics.

Recently, we've adjusted the key performance indicators that we report to our management away from absolute metrics (such as the number of visits to our sites) and towards measures of customer satisfaction. This reflects the fact that our goal is to provide the best possible experience. But 'satisfaction' is much more slippery to measure than pulling a number out of our analytics.

How are people feeling about the National Library website?

This left us with a 'hole' in our methodology, between the detailed (but time-consuming and specific) feedback we get from in-person usability testing, and the broad-reaching (but impersonal) quantitative measures afforded by analytics. That gap left us wondering: how can we get a sense of how a broad segment of our audience is feeling about the services we provide?

How the 472 respondents feel about the National Library website: a smiley face for each response, colour-coded by satisfaction level. Generally people are satisfied, apart from a small number who were dissatisfied.
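(For the technically inclined: a one-marker-per-response chart like this is easy to mock up. Below is a rough Python sketch; the scores are invented stand-ins, not our actual 472 responses.)

```python
# Rough sketch of a response grid: one marker per respondent, coloured by
# their satisfaction score. The scores below are invented placeholders.
import matplotlib.pyplot as plt

responses = [5, 4, 5, 3, 1, 5, 4, 4, 5, 2]  # hypothetical 1-5 scores

colours = {1: "darkred", 2: "salmon", 3: "gold", 4: "yellowgreen", 5: "darkgreen"}
cols = 25  # markers per row
xs = [i % cols for i in range(len(responses))]
ys = [i // cols for i in range(len(responses))]

fig, ax = plt.subplots(figsize=(10, 4))
ax.scatter(xs, ys, c=[colours[r] for r in responses], s=80)
ax.invert_yaxis()  # fill the grid from the top-left, like reading order
ax.axis("off")
plt.show()
```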

The Met: an inspiration for our survey

This past December, I read a post called Who Are the Users of The Met's Online Collection? by Elena Villaespesa, Digital Analyst for the Metropolitan Museum of Art. In it, she showed how the Met used a fairly simple survey to build a pretty sophisticated picture of who their visitors were. It was short, tightly edited, and to the point, with minimal room for ambiguity.

Inspired, we started planning a survey of our own, using the Met's as a starting point. We wrote some similar questions, including 'What best describes the main purpose for your visit today?'. And in our post-survey analysis, we wound up doing something similar to Elena's example of cross-referencing answers to one question with the responses to another.
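If you're curious what that kind of cross-referencing looks like in practice, here's a minimal pandas sketch. The column names and answers are hypothetical, not our actual survey data.

```python
# Cross-tabulating two survey questions: rows are the main purpose of the
# visit, columns are satisfaction levels, cells are respondent counts.
# The data here is invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "purpose": ["Researching a topic", "Visitor information",
                "Researching a topic", "Researching a topic"],
    "satisfaction": ["Very satisfied", "Dissatisfied",
                     "Satisfied", "Very satisfied"],
})

print(pd.crosstab(df["purpose"], df["satisfaction"]))
```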

Our goals

After some discussion, we drafted the following goals for our survey:

  • Understand how happy our audience is right now with our online services.
  • Identify the specific areas where we need the most improvement.
  • Keep it concise and focused, with just enough questions to get some context around our respondents and their motivations.
  • Establish a consistent baseline against which we can measure future progress by re-running the survey in future years, unchanged.

Designing the survey

One major difference between our survey and the Met's was our focus on customer satisfaction, whereas their main goal seemed to be learning about the composition of their audience. As such, our survey added a few traditional customer satisfaction survey questions to the mix.

The surveys ran on each of our public-facing websites: the main National Library website, Papers Past, and DigitalNZ.org, with a fourth version customised for the He Tohu mini-site. We did not include our catalogue web front-ends or the in-Library Search Stations, allowing us to focus on general public web traffic.

What we asked in the satisfaction survey

Below is a quick rundown of the questions we asked, and why. Except for the answer choices on question 2, the surveys on each of our sites were identical.

Question 1. How satisfied are you with your visit to [this National Library website] today?
Why we included it: This first question is the cornerstone of the survey. With only five answers possible, ranging from "Very satisfied" to "Very dissatisfied", its purpose is to offer the least ambiguous measure of satisfaction we can get.

Question 2. Which best describes the main purpose of your visit to [this National Library website] today?
Why we included it: It was vitally important to know what each respondent was trying to do. However, the National Library site has a lot of use cases around physical visits (opening hours, visitor information, current exhibitions, etc.) that are meaningless on sites not tied as strongly to a physical location, like Papers Past. As such, while this question appeared on all of the sites, its multiple-choice answers were tailored to each site.

Question 3. Which of the following are true?
Why we included it: This tick-as-many-as-you-want question was about outcomes, with options ranging from "I did not find what I was looking for" to "I found something better than I was looking for" to "I had no specific agenda in mind when I visited". This allowed us to look for connections between outcomes and satisfaction.

Question 4. In which languages should the materials you are seeking be written?
Why we included it: To determine how many people were looking for English-language materials, how many were looking for materials in Māori or other languages, and whether there was any difference in the satisfaction levels of those groups. This was a surprisingly difficult question to phrase concisely, and it still feels a bit awkward to me (rewrite suggestions welcomed!).

Question 5. Which of the following best describes you?
Why we included it: A quick "Who are you?" to determine how our respondents identify themselves. Very useful for distinguishing professional researchers from casual amateurs.

Question 6. Please rate the following attributes of our website.
Why we included it: A simple 1 to 10 rating of these attributes: Accuracy of information, Meeting my needs, Quantity of content, Quality of content, Layout/design, and Ease of navigation.

Question 7. How often do you visit our website?
Why we included it: Who is struggling more: regular users or first-time visitors?

Question 8. Any other suggestions about how we can improve our website?
Why we included it: We had hesitated about adding this question, fearing we might be overwhelmed with queries that expected a response. But before launch, Fiona Fieldsend, Manager Digital New Zealand, requested we keep the open comments, arguing (correctly, as it turned out) that too much feedback would be an excellent problem to have. The insights and extra context we got from this question proved invaluable.

We worked hard to keep the questions brief and focused, even though there was much more we were keen to ask. One last-minute deletion was a question about physical usage ("How often do you visit the library in person?"), due to doubts about how much insight it would really give us.

Once we had reviewed the survey with various groups throughout the Library, we posted the appropriate version on each site and linked to it from a message banner. The National Library, He Tohu, and DigitalNZ surveys ran for three weeks in April, while the Papers Past version ran for two weeks in July.

Response rates not high

Response rates were not stellar, at about 1 response per 100 visitors, but we gathered 472 responses across the four sites, enough to give us reasonably solid results with about a +/-9% margin of error. The He Tohu survey only elicited a handful of responses, so we combined those results with the National Library site for reporting purposes. We hope to investigate ways to increase the response rate (and therefore the accuracy) next time we run the survey.
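For context on that figure: the margin of error for a survey proportion is commonly estimated with the standard worst-case formula sketched below. This is an illustration rather than our exact calculation; note how a per-site sample of roughly 120 responses lands near ±9%.

```python
# Standard worst-case margin of error for a sample proportion at 95%
# confidence (z = 1.96, p = 0.5). Illustrative only; not necessarily the
# exact calculation behind the +/-9% quoted above.
from math import sqrt

def margin_of_error(n, z=1.96, p=0.5):
    """Half-width of the confidence interval for a proportion from n responses."""
    return z * sqrt(p * (1 - p) / n)

print(f"{margin_of_error(472):.1%}")  # ~4.5% for all 472 responses pooled
print(f"{margin_of_error(120):.1%}")  # ~8.9% for a per-site sample of ~120
```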

What we learned

When analysing the results, we lean pretty heavily on the initial question, as it was set up to elicit a snap response: How satisfied are you with your visit today? Given the margin of error, the responses are best reported on a simple 1 to 10 scale: the National Library site rated a 7 out of 10, DigitalNZ.org an 8/10, and Papers Past got top marks with 9/10.

Infographic displaying the average satisfaction scores for each website, as described in the paragraph above.
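One plausible way to turn five answer choices into a score out of 10 is to space them evenly across the scale and average, as in the sketch below. The mapping and the middle answer's label are illustrative assumptions, not necessarily the exact conversion behind the figures above.

```python
# Map the five satisfaction answers onto a 0-10 scale, then average.
# The answer labels and spacing here are assumptions for illustration.
SCORES = {
    "Very dissatisfied": 0.0,
    "Dissatisfied": 2.5,
    "Neutral": 5.0,  # assumed middle label
    "Satisfied": 7.5,
    "Very satisfied": 10.0,
}

def site_score(answers):
    """Average the mapped scores and round to a whole number out of 10."""
    return round(sum(SCORES[a] for a in answers) / len(answers))

print(site_score(["Very satisfied", "Satisfied", "Very satisfied"]))  # 9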

Characteristics and behaviours of the most and least satisfied respondents

Overall satisfaction: 8 out of 10. Not too bad. We then used the responses to this first question to identify the characteristics and behaviours of the most and least satisfied respondents. Throughout the reports we wrote to explain the results, we created a colour scale from dark green (most satisfied) to dark red (least satisfied) based on each respondent's answer to the first question, and used that scale to build the bar charts for each answer to the other questions. In practice, this is how it looked:

Example of a bar chart of aggregated answers to the Main Purpose question, coloured with satisfaction levels.
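For anyone wanting to reproduce this style of chart, here's a minimal pandas/matplotlib sketch. The purpose categories and responses are invented for illustration.

```python
# One horizontal bar per answer to the Main Purpose question, with each bar
# segmented and coloured by satisfaction level. Data is invented.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.DataFrame({
    "purpose": ["Researching a topic"] * 4 + ["Visitor information"] * 3,
    "satisfaction": ["Very satisfied", "Satisfied", "Satisfied", "Dissatisfied",
                     "Very satisfied", "Neutral", "Very dissatisfied"],
})

order = ["Very satisfied", "Satisfied", "Neutral",
         "Dissatisfied", "Very dissatisfied"]
colours = ["darkgreen", "yellowgreen", "gold", "salmon", "darkred"]

counts = pd.crosstab(df["purpose"], df["satisfaction"])
counts = counts.reindex(columns=order, fill_value=0)
counts.plot.barh(stacked=True, color=colours)
plt.tight_layout()
plt.show()
```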

What we learned across all of the sites

  • First-time visitors reported slightly lower satisfaction levels than regular, repeat visitors, suggesting we could do a better job of orienting newcomers.
  • In general, professional researchers (a group including academics, genealogists, authors/publishers, and librarians) were less satisfied overall than self-identified amateur researchers.
  • 96% were looking for materials in English, 11% in Māori, and fewer than 5% in any other language (respondents could choose more than one language, which is why these figures sum to more than 100%). Satisfaction scores were higher than average in the non-English group.
  • Fewer than one out of seven respondents reported having no clear goal for their visit; most people arrive with something specific in mind.

What we learned on the National Library website

  • More than three out of four respondents reported that they were looking for collection items (images, particular items by title, or researching a topic), confirming what we've observed from traffic analytics.
  • Topics that came up repeatedly in the open comments included difficulty with ordering images and accessing high-res versions, difficult navigation to the catalogues, and frustration with our logins and passwords.

What we learned on the DigitalNZ website

  • A majority of respondents were researching people or family history. Every single visitor who reported being 'very dissatisfied' with their visit was in the 'I'm researching a person' group. The open comments offered a hint that the fuzziness of the DigitalNZ search may be making exact name searches difficult for this group.
  • Topics coming up frequently in the open comments included: navigating search results, clarity of rights statements, and searching for particular people.

A comment from a genealogist telling us that the DigitalNZ search was not working for them: the open comment that connected this group's relatively low satisfaction scores to the nature of the problem.

What we learned on Papers Past

  • The Papers Past audience is incredibly loyal. Nearly 2 out of 3 respondents reported that they use the service at least once a week (compared to about 1/4 on our other sites).
  • Six out of ten respondents were researching a person or family, with most of those researching their own family history. Researchers of particular topics or historical events accounted for 11% each, and 5% of visitors were researching a place.
  • There was an almost equal split between those identifying as 'curious amateurs' (43%) and professional researchers (45%), with a plurality of the latter being genealogists (27% of all respondents).
  • A small but vocal group of visitors reported their primary goal was solely to find out what materials had been added lately and to work their way through those items.
  • Topics coming up most frequently in the open comments included: more papers and date ranges, please; better printing and saving of bookmarked articles; the ability to correct OCR text; more advanced search options (such as wildcards); difficulty returning to search results from article pages; improving the date pickers; and adding search filters to mobile views.

What happens next?

A detailed report on the full results has been shared with the Digital New Zealand team, and the areas of greatest dissatisfaction identified by the survey have been targeted for improvement in the team’s 2018-2019 Year Plan. We're working with our development team and our colleagues to think about solutions to the issues highlighted.

We will reprise the survey in April 2019 to see if we've improved things. Look for more insights next year.

In the meantime, if you have any questions, please feel free to get in touch with Michael Lascarides at michael.lascarides@dia.govt.nz.

By Michael Lascarides

Michael is the User Experience Manager for the National Library.
