
✨ The objective

Redefining the right success metrics first

Before delving into problem analysis, it was crucial to establish a meaningful, concrete goal so that we could approach the problems correctly. For example, the exposure rate of Fast Apply jobs to users was initially a key metric, and it had begun to decline. However, we discovered that the decrease was due to the increased relevance of the exposed jobs, which resulted in higher view and application numbers. This finding helped us redefine the key metrics and goals that truly mattered.

Another consideration was determining which data to track and analyze. Should we analyze only new users' behavior data? What time period or session count should define a "new" user? Should we separate active from inactive users? These and many more questions had to be answered first.
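
To make such a cohort definition concrete, here is a minimal sketch in pandas of how a "new user" segment might be derived from signup recency and session count. The table, columns, and thresholds are illustrative assumptions for this post, not Arc's actual schema:

```python
import pandas as pd

# Illustrative user table; names and values are hypothetical, not Arc's data.
users = pd.DataFrame({
    "user_id": [1, 2, 3],
    "signup_at": pd.to_datetime(["2023-05-01", "2023-04-01", "2023-05-10"]),
    "session_count": [3, 40, 1],
})

analysis_date = pd.Timestamp("2023-05-14")

# One possible definition of a "new" user: signed up within the last
# two weeks and fewer than five sessions so far.
is_new = (
    (analysis_date - users["signup_at"] <= pd.Timedelta(days=14))
    & (users["session_count"] < 5)
)
print(users.loc[is_new, "user_id"].tolist())  # -> [1, 3]
```

Whatever thresholds are chosen, fixing them up front keeps the later funnel numbers comparable across analyses.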

💎 Problem analysis

Learning from different sources

Based on the clarified metrics and goals, the primary question was: "Why don't users click on the visually highlighted Fast Apply jobs at the top of the page?" Here is how we approached the problem analysis.

1. Zoom Out: Analyzing User Behavior Data

We began by analyzing user behavior data to identify general problem areas. Much like a health exam report, this data points to areas of concern; as problem-solvers, akin to doctors, we started the diagnosis there. We discovered two key problem areas:

  1. The percentage of newly signed-up users (within two weeks of signup and onboarding) who clicked on the Fast Apply job section was unexpectedly low, at only 7%.

  2. Even fewer users clicked on the 'Fast Apply' tab (see the image below).

To address these significant issues, both for the business and user experience, we needed to delve deeper and understand the underlying causes.
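
For the curious, here is a rough sketch of how a metric like the first one can be computed from raw events. The event log, event names, and columns below are invented for illustration and are not Arc's actual data:

```python
import pandas as pd

# Hypothetical event log: one row per user action; event names are invented.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 4],
    "event": ["signup", "fast_apply_click", "signup",
              "signup", "all_jobs_click", "signup"],
    "ts": pd.to_datetime(["2023-05-01", "2023-05-03", "2023-05-02",
                          "2023-05-04", "2023-05-05", "2023-05-06"]),
})

signups = (events[events["event"] == "signup"]
           .loc[:, ["user_id", "ts"]]
           .rename(columns={"ts": "signup_at"}))
clicks = events.loc[events["event"] == "fast_apply_click", ["user_id", "ts"]]

# Keep only Fast Apply clicks that happened within two weeks of signup.
merged = clicks.merge(signups, on="user_id")
early = merged[merged["ts"] - merged["signup_at"] <= pd.Timedelta(days=14)]

# Share of newly signed-up users who clicked the Fast Apply section.
click_rate = early["user_id"].nunique() / signups["user_id"].nunique()
print(f"{click_rate:.0%}")  # -> 25% in this toy example
```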

On the Arc platform, we have two types of job postings. The first type is Fast Apply jobs, which come from our employer clients and represent our core business model as a job board platform. The second type is External jobs, which we gather from other job boards.

Previously, in an effort to increase user views of and applications to Fast Apply jobs, we implemented a design solution that highlighted those jobs on the user's dashboard. However, the numbers remained low. It was clear that we needed to iterate on our design based on what we had learned.

Below is how the previous design looked.

2. Zoom In: Observing User Behaviors

Through user session recordings in Fullstory, we discovered that the majority of newly signed-up users skipped the Fast Apply job section and immediately scrolled down to the job list (the 'All jobs' tab) upon landing on the dashboard for the first time. Furthermore, after browsing the 'All jobs' list, most users clicked on external jobs rather than Fast Apply jobs. While we could speculate that the 'Fast Apply' tab was not immediately visible to users, we couldn't understand why they would disregard the Fast Apply jobs section, which contained more appealing and detailed job data than the external jobs.

3. Deep Dive: Learning from Users

Focusing on the identified problem areas and user behaviors, we conducted user interviews with two distinct groups of candidates:

  1. Users who had signed up previously but had never clicked on the Fast Apply section.

  2. New users who had not yet signed up for Arc.

While I won't delve into the specific details of our interview methodology, the feedback we consistently received from both groups can be summarized as follows:

"This section... it looks like advertising because it's highlighted "

This feedback struck a chord with us. It was perhaps overly optimistic to expect that highlighting something would automatically capture users' attention. There was nothing inherently wrong with our design principles, but we had failed to consider users' actual product experience in a broader context. Users had encountered numerous products, particularly job board platforms, that highlighted elements primarily for the company's business growth rather than to enhance the user experience. Consequently, users had developed a habit of simply ignoring such highlights.

🌈 Solution design

Turning insights into solutions & outcome

After a few more design iterations and rounds of user testing, we finally landed on the design below.

This solution may appear simple (and indeed, it is), but its simplicity is the result of the comprehensive problem analysis we conducted. The outcomes of this straightforward solution were remarkably significant:

  • Fast Apply job click rate increased from 7.9% to 13.74%, a relative improvement of roughly 74% (13.74 ÷ 7.9 ≈ 1.74).

While better solutions could no doubt have been explored and experimented with given additional time, we successfully resolved the defined problem, which freed us to address other pressing and critical issues. Furthermore, this experience served as a valuable learning opportunity, allowing us to establish design principles of our own that we can apply to future design work.

* To maximize the impact of the new design, we also put effort into redesigning the JD (job description) experience to help our users browse and digest relevant information quickly (see the images below).
