
UX Research Case Study

Honey's Hot Sauce Website Usability Test

Introduction

Honey's Hot Sauce is a new product offering meant to compete in the organic gourmet hot sauce market. The original project lasted four months and followed the Double Diamond model. Since the original UX Researchers were not available, I was asked to join the team to conduct usability testing on a prototype of the Honey's Hot Sauce website. I joined for the Deliver phase and collaborated on usability testing with a UX/UI Designer from the original team and a new UX Designer.


Original team

UX Design Lead

2 UX Researchers

2 UX/UI Designers
UX Writer

Front-end Developer

Graphic Designer

User Test Team

new UX Designer

new UX Researcher (Me)

UX/UI Designer (original team)

Project Details

Type: E-commerce website

Test platform: Figma prototype

Duration: 3 weeks

Team size: 3, all remote, different time zones

Participants: 9

Initial Observations

After reviewing the existing artifacts for the project, I had several concerns: 

  • The prototype had significant issues and was not ready for user testing.

  • Analysis of the initial research, based on six interviews, was incomplete, and I was concerned there might be missed opportunities.

Tools/Methods

Surveys

In-person/remote moderated tests

Quantitative/Qualitative Analysis

Zoom

Google Sheets/Docs

Slack

Figma/FigJam

Maze

Objectives

The overall goals of testing were to make sure: 

  • customers could navigate and make purchases without pain points.  

  • the original research accurately defined all of the user’s needs.

  • bias did not affect interpretation of results.

  • actionable issue resolutions were identified.

Where I joined - Deliver stage

I really like the version of the Double Diamond model used in the book "Think Like a UX Researcher" because it demonstrates how opportunities for innovation may be lost by skipping past research into user needs. I added annotations in dark red. The remaining sections describe the deliverables created at each step.

[Image: Design Council's Double Diamond model, annotated in dark red with "You are HERE" at the Deliver stage and labels for the problem space and solution space]

Figure 1.1 Design Council's Double Diamond Model - Think Like a UX Researcher - Travis & Hodgson

What was done and who did what

Define UX Research Strategy

  • Define UX Research Goals

  • Results collection methodology

  • Error rating scale (see the sketch below)

  • Host kickoff meeting

new UX Researcher (Me)
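
The error rating scale itself is not reproduced in this case study. As a reference point, here is a minimal Python sketch assuming a 0-4 scale along the lines of Nielsen's severity ratings; the team's actual scale may have differed.

```python
from enum import IntEnum

class Severity(IntEnum):
    """Hypothetical 0-4 error rating scale, modeled on Nielsen's severity ratings."""
    NOT_A_PROBLEM = 0  # reviewer disagrees that this is a usability problem
    COSMETIC = 1       # fix only if extra time is available
    MINOR = 2          # low priority; slows the user slightly
    MAJOR = 3          # high priority; derails the task
    CATASTROPHE = 4    # imperative to fix before release

# Example: tag an observed issue with a severity for later triage
issue = {"description": "Recipe text mistaken for a button", "severity": Severity.MINOR}
print(issue["severity"].name, int(issue["severity"]))  # MINOR 2
```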

Define Test Plan

  • Test script

  • Pre-test questions

  • User Tasks / Follow-up questions

  • Post-test questions

new UX Designer

new UX Researcher (Me)

UX/UI Designer (original team)

Execute Test Plan

User Testing

5 in-person moderated tests

new UX Researcher (Me)

2 remotely moderated tests

new UX Designer

2 in-person tests

UX/UI Designer (original team)

Review Results

  • Review results

  • Assign severities

  • Propose next steps/solutions

UX Design Lead (original team)

new UX Designer

new UX Researcher (Me)

UX/UI Designer (original team)

Share and Present Results

  • Prepare Test Report

  • Present Results

new UX Researcher (Me)

Results

While preparing the test report, I often went back to the initial research to determine how well it captured the needs of the user. Because part of research is communicating actionable data, I documented several missed opportunities and disconnects that could have been addressed prior to testing.

1) Missed insights and opportunities from initial UX Research

Original Research

  • 33% willing to try new sauces

  • 33% use hot sauce on specific foods

  • 44% were not sure of the definition of a pairing sauce

"What do people usually put hot sauce on?"

User Testing Research

  • 100% did not fully understand the difference between a cooking sauce and a pairing sauce.

  • The site did not recommend specific pairing sauces for specific dishes

"I cook all of the time and never heard of pairing sauces"

"What makes a pairing sauce different than a cooking sauce?"

Solutions

  • Analyze how competitors explain pairing sauces and refine the messaging

  • Add informative UX writing to define pairing sauces and give examples of usage

  • Alternatively, remove reference to pairing sauces altogether

2) Explore/Search & View/Shop navigation

[Image: site navigation showing the "Explore", "View All", and "Shop All" options]

Summary - The current Information Architecture uses similar, overlapping terms that make navigation difficult.

  • 78% were not sure what "Explore" would do

  • 56% did not understand the difference between "View All" and "Shop All"

  • 44% wanted to use just the search icon to locate a hot sauce
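
Since these percentages come from just 9 participants, each participant shifts a result by roughly 11 points. A quick sketch, assuming each reported figure is a simple count out of the 9 testers (the counts are assumptions reverse-computed from the percentages):

```python
PARTICIPANTS = 9  # total moderated tests in this study

# Assumed raw counts behind the reported percentages
observations = {
    'unsure what "Explore" would do': 7,            # 7/9 ≈ 78%
    'did not understand View All vs. Shop All': 5,  # 5/9 ≈ 56%
    'wanted to use the search icon instead': 4,     # 4/9 ≈ 44%
}

for finding, count in observations.items():
    print(f"{finding}: {count}/{PARTICIPANTS} = {count / PARTICIPANTS:.0%}")
```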

Solutions

  • Adhere to established design patterns and remove "Explore" from navigation

  • Remove "Shop All" and "View All " from left side navigation

3) Recipe UI resembles a CTA button

[Image: recipe element styled like a green CTA button]

Summary - 56% thought that the 'Recipe' text was actually a button

Solutions

  • Remove the green from the Recipe element and replace "Recipe" with the title of the recipe

  • Remove original recipe text

4) Disconnect with branding - Dog & 'Honey'

Original Research

  • Research focused on branding & trustworthiness perception

  • 33% think honey is an ingredient

  • 50% confused by dog on label

[Image: close-up of the Honey's Hot Sauce label, which features a dog]

User Testing Research

  • 44% think honey is an ingredient

  • 56% did not relate dog to a hot sauce

  • 22% did not see the apostrophe in Honey's

"Honey's?, is the sauce sweet?"

"I never connected that Honey is the dog's name, I missed apostrophe"

"Why is there a dog on the label?"

Summary - Further research on branding strategy, beyond analyzing trustworthiness, is needed to measure how well this product connects with users. Since the dog belongs to the owner of Honey's Hot Sauce, personal bias may have influenced previous branding decisions. Furthermore, there might be an opportunity to create a gourmet hot sauce that stands out with a special ingredient, such as honey.

Solutions

  • Complete additional research to determine the viability of the current branding strategy

  • Review the viability of marketing the hot sauce with honey as an ingredient

5) Rainbow tracking sheet could not accommodate duplicate issues

[Image: rainbow tracking sheet with duplicate issues logged as separate, unlinked entries]

Summary - Duplicate issues did not reference each other, creating an inflated total of observed issues

Solutions (shown below)

  • A new column was created to indicate that an issue was a duplicate, and "DUPLICATE" plus the original issue number was added to the description.

  • A tab was created to provide instructions on how to log duplicate issues

[Image: updated rainbow sheet showing the new duplicate-issues column, which uses a color code to highlight closely related issues]
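
To show how the new column keeps duplicates from inflating totals, here is a minimal sketch, assuming each logged row carries an optional reference to the original issue number (the field names are hypothetical, not the actual sheet's):

```python
# Hypothetical rows from the rainbow sheet; duplicate_of mirrors the new
# duplicate-issues column (None means the row is an original observation).
issues = [
    {"id": 1, "description": "Recipe text mistaken for a button", "duplicate_of": None},
    {"id": 2, "description": "DUPLICATE 1 - recipe element looks clickable", "duplicate_of": 1},
    {"id": 3, "description": "'Explore' label unclear in navigation", "duplicate_of": None},
]

originals = [row for row in issues if row["duplicate_of"] is None]
print(f"rows logged: {len(issues)}, unique issues: {len(originals)}")
# rows logged: 3, unique issues: 2 -> duplicates no longer inflate the total
```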

6) Issue write-ups are sometimes too vague and do not reference specific screens

Example: A specific screen IS NOT referenced in any of these issue descriptions:

"Recipe link on explore page was not working"

"Mango Habanero page: Graphics under "Details"styles don't match (some have a stroke, some don't)"

"Write a review button enabled when user is not logged in (user is confused as to why she's able to write a review when she's not even logged in)"

Summary - During review of the test data, a lot of time was spent trying to find the screen related to each issue.

Solutions

  • Provide specific instructions, with correct and incorrect examples, on how to write up issues, and require that each issue specify the screen title/number

  • Peer review issue write-ups prior to reviewing results
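
One lightweight way to enforce the screen-reference requirement before results review, as a minimal sketch; the "<Screen name> page:" / "S<number>:" naming convention here is a hypothetical example, not the team's actual standard:

```python
import re

# Hypothetical convention: every write-up must start with a screen title
# ("Mango Habanero page:") or a screen number ("S12:").
SCREEN_PREFIX = re.compile(r"^(?:[A-Z][\w ]+ page|S\d+):")

def has_screen_reference(write_up: str) -> bool:
    """Return True if the issue write-up leads with a screen title or number."""
    return bool(SCREEN_PREFIX.match(write_up))

assert has_screen_reference("Mango Habanero page: Details graphics don't match")
assert not has_screen_reference("Recipe link on explore page was not working")
```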

7) Team member missed deliverable deadline and did not communicate with team

Summary - A member of the User Test Team did not have their data entered into the tracking sheet prior to the review meeting and failed to notify the team ahead of time. At the start of the meeting, the member indicated that they had seen similar issues. The issues were then entered after the meeting, but I was concerned this might introduce bias into their data or that valuable observations were not captured.

Solutions

  • Establish team norms for communication and ownership of responsibilities

  • Create a check-in prior to meetings to make sure everyone is on target
