UCI Library Search Engine User Study

Project Goal: Gauge user feedback on the proposed redesign of the UCI Library Search Experience through user research studies, then recommend design changes to improve user satisfaction based on the findings.

Role: Lead User Experience Researcher

Methods: Stakeholder and User Interviews, Heuristic Evaluation, Cognitive Walkthrough, Card Sorting, Usability Testing, A/B Testing, and Surveys

Highlight: Moderated user testing sessions and led data synthesis

Outcome: Our redesigned experience garnered a 100% satisfaction rate and was preferred over the proposed design. Recommendations were adopted on the live site.

This building, CalIT2, is where the usability testing took place. It is a joint research facility shared by UC Irvine and UC San Diego. 

In Summer 2014, the UCI Libraries prepared to launch their new search engine to tens of thousands of users. I led a team of four to evaluate their redesign and make recommendations for improvement. Within weeks, my team created a rapid prototype that achieved a 100% user satisfaction rate.

Study Objectives

  1. Measure if users are able to locate media using the UCI Library search engine.

  2. Observe how many steps and how much time it takes to locate an item.

  3. Note how well the resource tabs serve users in locating items.

  4. Conduct usability tests of the new search engine with students, librarians, and professors.

Methodology

Since the primary objectives targeted the usability and efficiency of the search engine, I initially employed the following methods:

  • Task Scenarios: Understand how users interact with the system to complete primary tasks.
  • Think-Aloud: Verbalize thoughts to understand the strengths and weaknesses of the user experience.
  • Heuristic Evaluation: Identify user interface issues with the search engine based on design principles.
  • Cognitive Walkthrough: Work through tasks to understand learnability for new users. 
  • Semi-Structured Interviews: Conduct post-testing debriefs to gather feedback and measure satisfaction.

Here is the equipment used during the usability testing sessions.

Findings

Usability testing revealed two themes: user interface confusion and a lack of user assistance.

 

Interface Issues

Users struggled to locate resources via the search engine due to its monochromatic design.

Over half of the users did not realize that there were additional links in other tabs because the "Found 0 matches" message was so prominent.

Inconsistent highlighting of search terms across tabs, sometimes in yellow and sometimes in bold, puzzled users.

Test subjects were frustrated that the back button always resulted in an error rather than returning them to the previous page within the search engine. 

 

Lack of User Assistance

First-time users were confused by the lack of explanation for some parts of the system.

In the case above, misspelling "Informatics" yielded just two results, even though there are over 10,000 results when it is spelled correctly. Users expressed a desire for spell-check.

Unfortunately, many first-time users did not know that ANTPAC and Melvyl are library catalogs, leaving them unsure which one to choose. There were also large discrepancies in how many resources were available in each catalog, with no visual indication to guide users.

Users conducting preliminary research for projects voiced the need for suggested topics to help them get started when they did not know which terms to search for. There was also no filter to narrow down results, which would be particularly helpful for searches that return 10,000+ results.

 

Rapid Prototype

Having evaluated the study objectives with a variety of research methods and users, we set out to create a prototype that addressed the newly discovered issues.

In one week, I worked with our designer to create an interactive prototype of a more usable and elegant search engine. Given the research objectives and time constraints, we prioritized the main complaints by implementing the following:

  • The volume of results is reflected in the tab color. For example, a tab with zero results grays out, while a tab with one or more results turns orange, so users can quickly identify which catalogs contain results.

  • Consistent highlighting between tabs.

  • Removed unnecessary page elements.

  • Dynamic search: as the user types different keywords, the engine generates results in real time (a minimal sketch of this behavior follows this list).
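
The sketch below illustrates how the tab coloring and dynamic search could fit together. It is a minimal TypeScript sketch under stated assumptions: the /search/counts endpoint, the #search-box input, the data-tab attributes, and the exact colors are placeholders for illustration, not details of our actual prototype or the UCI Libraries codebase.

    // Minimal sketch of the prototype's tab coloring and dynamic search (assumed names).

    type CatalogTab = "ANTPAC" | "Melvyl" | "Articles";

    // A tab with zero results grays out; a tab with results turns orange.
    function tabColor(resultCount: number): string {
      return resultCount === 0 ? "#cccccc" : "#f47b20";
    }

    // Debounce keystrokes so the engine queries only after the user pauses typing.
    function debounce<A extends unknown[]>(fn: (...args: A) => void, delayMs: number) {
      let timer: ReturnType<typeof setTimeout> | undefined;
      return (...args: A) => {
        if (timer !== undefined) clearTimeout(timer);
        timer = setTimeout(() => fn(...args), delayMs);
      };
    }

    // Assumed endpoint that returns per-catalog result counts for a query.
    async function fetchResultCounts(query: string): Promise<Record<CatalogTab, number>> {
      const res = await fetch(`/search/counts?q=${encodeURIComponent(query)}`);
      return res.json();
    }

    // Dynamic search: re-query and recolor the tabs as the user types.
    const onSearchInput = debounce(async (query: string) => {
      const counts = await fetchResultCounts(query);
      for (const tab of Object.keys(counts) as CatalogTab[]) {
        const el = document.querySelector<HTMLElement>(`[data-tab="${tab}"]`);
        if (el) el.style.backgroundColor = tabColor(counts[tab]);
      }
    }, 300);

    document
      .querySelector<HTMLInputElement>("#search-box")
      ?.addEventListener("input", (event) => {
        onSearchInput((event.target as HTMLInputElement).value);
      });

Debouncing keeps the real-time behavior responsive without sending a request on every keystroke.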


Impact

Both returning and first-time usability test participants were delighted with our redesigned search engine prototype. They praised the minimalist design, pleasing color scheme, and clear display of information.

100% of tested users reported satisfaction and said they would recommend the system to friends. 

We were particularly flattered by an interviewee who said, "Please find a way to implement this as soon as possible!"

At the time of the study, the search engine was one month from going live. The development team told us they did not have time to conduct usability testing and expressed appreciation to us for sharing our findings.

Ultimately, aspects of our feedback were implemented in the live product, as shown in the screenshot above.

This study underscored the importance of applying user-centered design principles early and often to both meet stakeholder objectives and delight users.