User Research
Usability Testing

Ontario.ca Search User Research

Ontario.ca is the official website of the Government of Ontario, serving as a central portal for residents, businesses, and visitors to access a wide range of government services, information, and resources.

The Experience Design Lab is an internal team within the Ontario government that provides UX research and design support to various ministries. I participated in this project as a co-op intern at the lab. Our role was to conduct user research to evaluate the effectiveness of search result displays on Ontario.ca and to test proposed changes.

Project Background

The overdue research to contextualize survey findings

The design team at Ontario.ca has made ongoing efforts to improve the website's internal search functions, such as through a survey embedded on the search results display page (image on the left/above).

However, no user research had yet been conducted on the search function. The project's goal, therefore, was to contextualize survey feedback on internal search and to test potential changes to the search results display page.

Project at a glance

Project Context
  • Part of my Co-op work term at Experience Design Lab, Ontario public service
  • Duration: May 2024 - August 2024
  • Team: Ontario.ca Search Team (client team), Senior Service Designer (oversight), Experience Design Co-ops (me and the other co-op student)
My Responsibilities
  • Created moderator's script and prototype
  • Scheduled and conducted user sessions (4/7)
  • Analyzed research results and created the findings guideline
  • Communicated with the client/product team
Project Goal
  • Evaluate the overall effectiveness of Ontario.ca search for selected topics and test proposed changes
  • Explore what influences people's decisions/behaviours during the search process
Project Outcome
  • 8 global findings + 12 feature-specific findings

Research Planning

Creating the user research plan and deciding on methodology

As part of our organizational practice, the first step was to create a user research plan to align with the client team on research goals and objectives. After reviewing the existing information and desk research provided by the client team, and following our internal discussions, we decided that each session would include three parts: semi-structured interview questions, pre-set search tasks with defined user flows, and a prototype review.

1. Interview Questions

aimed at understanding the participants' background and history with search

2. Pre-set search tasks

to observe how participants search in real time and what influences their decisions and behaviours during the process

3. Review prototypes

to test the proposed changes on the search results display page using Figma prototypes

Developing my first "formal" moderator's script

This search project was my first experience working on a public-facing initiative, unlike my previous school projects. To draft the moderator's script, I reviewed several scripts from past team projects. However, since this was our first user research on search, there weren’t many similar references.

Initially, I wasn’t sure if I was on the right track. My fellow co-op student suggested consulting with the senior designer overseeing the research to ensure we were on the right path. This experience highlighted for me the importance of seeking guidance when needed, to help align goals and improve efficiency.

Prototype

Next, we prepared prototypes of the search results pages in Figma, incorporating the proposed design changes to present them to users during the sessions.

The constantly evolving prototype scope

The scope of this prototype evolved significantly. Initially, we planned to create some simple, static screens for users' verbal review and feedback. However, after meeting with the stakeholders, we decided the prototype should be interactive. Later, the client team also suggested adding a mobile version to observe users' search behaviors on mobile.

Making some commonly used components

As the scope expanded, I decided to turn commonly used elements into components to ensure we could make systematic changes easily, given the large number of screens. This approach was effective and served its purpose well.

Commonly Used Components
Component set for content block

The highly collaborative prototype process

This prototype process was highly collaborative. The other co-op student and I divided the work between desktop and mobile, and the client team often joined us to provide inspiration and suggest changes as well.

Through this collaboration experience, I learned that everyone’s approach to working with Figma or design can vary, which means it is important to stay flexible and open-minded. Additionally, I realized that a prototype for usability testing doesn’t have to be pixel-perfect; what matters most is that it functions effectively and helps us achieve our research goals.

Desktop Flow
Mobile Flow

Recruitment & User Interviews

The user sessions consisted of a mix of semi-structured interviews and pre-set tasks, lasting around 45 minutes and conducted remotely through Zoom. We also asked participants to share their screens while completing the tasks.

Set-back with recruiting participants

The partner team provided us with a list of participant contacts. Initially, we sent the recruitment screener to a limited number of people who were more aligned with the topics of our designed user flow. However, we were unable to secure enough participants given the small screening pool.

Later, we changed our approach and sent the screener email to everyone who had provided their contact information. This allowed us to gather more responses, ensuring that our participants represented a diverse range of factors, such as geographical location.

Facing the unexpected during live sessions

Dealing with the unexpected can be challenging. During one of the sessions I was facilitating, the participant experienced internet issues midway through. Under the direction of the senior designer, I first attempted to resolve the issues a few times before letting the participant know that we might have to end the session and follow up later. Fortunately, we were able to reschedule with the participant a few days later.

I had encountered technology issues at the start of sessions before, but never in the middle. This experience taught me valuable lessons on handling situations like that: staying calm, being adaptable, and communicating effectively with participants.

Redirecting participants back to the "Search" task

By nature, search is a highly integrated action, which often led people to comment on the content of the page rather than focusing solely on the search task. Initially, it was difficult to determine whether to interrupt and redirect them to the search results, which were our primary research focus. Over time, I became more comfortable making judgment calls based on relevance to our research interests while also being mindful of time constraints.

Data Analysis

Co-analysis session to synthesize initial themes

As someone new to user research, my initial instinct after the sessions wrapped up was to review the recordings to fill any gaps in my note-taking. However, the team suggested that we hold a co-analysis meeting first, as it might be a more efficient approach.

Therefore, we conducted a co-analysis session where we reviewed the findings as a team and generated basic themes. As an emerging designer, I was glad to gain the experience of learning how to use recordings strategically.

Digging deeper into the data

After the co-analysis session, the other co-op student noted that we still needed to dig deeper into the data. We therefore did a basic cleaning of the data to get a clearer idea of which specific quotes or stories we might use in the reporting slides. To make it easier to compare participants' thoughts on the proposed changes, I also suggested creating a table organized by participant and their comments on the proposed features.

Report & Presentation

Balancing comprehensiveness and conciseness in report writing

We structured our report slide decks into global findings and feature-specific findings. Since the basic themes had already been developed during the co-analysis session, I focused on describing the themes in more detail and supporting them with direct quotes from the interviews.

To make the descriptions both comprehensive and scannable, I reviewed several past slide decks for reference and also used our newly launched enterprise-wide AI technology, Copilot, to help brainstorm ideas and proofread.

Presenting our findings

We first presented the research findings to the product team. Since it was the developers' first time learning about the research, I paused for questions and screenshared our prototype to provide context. When delivering recommendations, I tailored my communication to the audience, considering the product owner's strategic focus and the developers' technical concerns.

Additionally, we presented to the entire experience design team within the department. I made sure to pause, monitor the chat in Teams, and address any questions or comments in real time. Both presentations went smoothly, allowing for productive discussions and clear next steps.

Reflection

This project was my first as a co-op student in the lab and my first professional experience as a UX researcher/designer. Throughout the process, I focused on reflecting on my experiences. Here are some key learnings:

Be comfortable with ambiguity

Unlike a more straightforward task such as a sign-up process, a search task is dynamic and ambiguous. This experience ultimately helped me become more comfortable with ambiguity: I learned to adapt quickly to unexpected situations and to embrace the fluidity of the process.

Be ready to walk people through the tech set-up

With many people familiar with Zoom post-pandemic, we initially thought a detailed guide wasn’t necessary. However, when several participants still needed support, I realized I should have had one in hand. This taught me to always be ready to provide a full walk-through of the technology during virtual user sessions.

It is okay to interrupt people when necessary

During my first few dry-runs and interviews, I was hesitant to interrupt, which often left us short on time for the final part of the session. Gradually, I learned to step in when participants went off-topic and used sticky notes to remind myself when to move on to the next section, helping me manage time more effectively.