BigRoad Freight

Optimizing the Onboarding Experience through Evaluative User Research

Where are all our users going?

In my first three months with FleetOps, I worked to increase user conversion through the onboarding flow and long-term adoption of the BigRoad Freight load board.

I used mixed research methods, including Mixpanel data analytics, Hotjar user recordings and heatmaps, and 1:1 user outreach, to test and implement changes that led to a 70% increase in onboarding conversion.

Wait…what’s a load board?

Just in case you’re not familiar with the logistics & freight industry, here’s a little context.

FleetOps is an AI-powered SaaS company that integrates with the onboard Electronic Logging Devices (ELDs) on commercial freight vehicles to deliver personalized load options to carriers operating on the spot market.

In other words - FleetOps has created a smart load-matching platform that meets truck drivers where they are (in their cabs) to deliver an alternative to the traditional freight marketplace model.

During my time with FleetOps, I worked on their carrier-facing (truck driver) platform, BigRoad Freight, to create an experience that was more efficient, transparent, and profitable than the existing alternatives.

Here are the methods I used to make the magic happen 👇

Find the leak in the funnel

When presented with the problem of high traffic but low usage, my first step was to dig into the analytics to identify the leak in the onboarding funnel. Using the data analytics tool Mixpanel, I pulled the following reports and began investigating.

Onboarding conversion report - indicating a new user conversion rate of 6.46%

New user retention report - indicating a significant usage drop-off at the 7-day mark

Funnel report - showing conversion by user type at each stage of the onboarding flow

Onboarding Conversion Report

Shows users who launch the app for the first time and land on the first page of the onboarding flow, compared to the users who complete the onboarding flow and enter the app.

New User Retention Report

Shows the percentage of new users who return to the app over time and perform a specific action, in this case to search for a load.

Onboarding Funnel Report

Shows the percentage of new users who complete each step of the onboarding flow, separated by user type.
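If you’re curious what a funnel report like that boils down to, here’s a minimal sketch of the underlying computation - given raw onboarding events, find the share of entering users who reach each step. The event shape and step numbering are my own illustration, not the actual Mixpanel schema.

```ts
// Minimal sketch: per-step funnel conversion from raw onboarding events.
// Event shape and step numbering are illustrative, not the real schema.
type OnboardingEvent = {
  userId: string;
  step: number; // 1 = first onboarding page, totalSteps = flow complete
};

function funnelConversion(events: OnboardingEvent[], totalSteps: number): number[] {
  // Track the furthest onboarding step each user reached.
  const furthest = new Map<string, number>();
  for (const e of events) {
    furthest.set(e.userId, Math.max(furthest.get(e.userId) ?? 0, e.step));
  }
  const entered = furthest.size; // everyone who hit at least one step
  // For each step, what share of entrants made it that far?
  return Array.from({ length: totalSteps }, (_, i) => {
    const reached = [...furthest.values()].filter((s) => s > i).length;
    return entered === 0 ? 0 : reached / entered;
  });
}

// e.g. funnelConversion(events, 4)[3] ≈ 0.0646 would line up with the
// 6.46% overall conversion rate from the first report.
```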

Found the leak, now what?

By reviewing the Mixpanel data, I was able to identify that the ‘leak’ was occurring in the onboarding process. This is shown most clearly in the first report, which illustrates the number of users who fall off between opening the onboarding modal and completing it. The third report breaks down each step of the onboarding flow, indicating the number of users who complete each step in the modal. It shows that even on the first page of the onboarding modal, users are navigating away from the platform.

With the ‘leaky area’ identified, my next step was to uncover what might be causing users to quit when encountering the onboarding flow. For this, I first turned to Hotjar to review unmoderated user behavior on the pages in question.

Hotjar heatmaps of the onboarding flow indicated users understood the modal & could navigate it easily.

Recordings of user sessions through the onboarding flow showed the same - even users who failed to complete the flow navigated it smoothly.

I used a Hotjar survey, triggered when users left the onboarding flow before completing it, to begin gathering user-reported insights.
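Mechanically, a survey like this is configured in the Hotjar dashboard and targeted at a custom event fired from the app with Hotjar’s hj('event', …) API. Here’s a rough sketch of how the trigger could look - the event name and abandonment check are illustrative, not our production code.

```ts
// Rough sketch: fire a Hotjar event when a user bails on onboarding,
// so a dashboard-configured exit survey can target that exact moment.
// The event name and abandonment detection are illustrative.
declare function hj(command: "event", eventName: string): void; // Hotjar global

let onboardingComplete = false;

function markOnboardingComplete(): void {
  onboardingComplete = true;
}

// Called when the user closes the modal or leaves the page.
function handleOnboardingExit(): void {
  if (!onboardingComplete) {
    hj("event", "onboarding_abandoned"); // the survey targets this event
  }
}

window.addEventListener("beforeunload", handleOnboardingExit);
```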

So it’s not a usability issue…

Reviewing user behavior through heatmaps and recordings revealed that users were not encountering any usability snags. Even recordings of users who failed to complete the onboarding flow showed patterns of behavior consistent with a smooth user experience: they navigated the modals quickly and did not display any signs of confusion that would indicate usability or navigational issues.

I used the insights generated from this quantitative research to structure user interviews, aimed at identifying why this drop-off was occurring and what we could do to fix it.

I segmented the user outreach into dropped users & retained users, and conducted 30-minute semi-structured interviews with 30 individuals. In conversations with both subsets of users I covered their onboarding experiences, but also touched on their expectations upon opening the BigRoad Freight app, their experiences with other load board apps, and their workflows and day-to-day experiences in their jobs.

Dropped Users

  • didn’t complete onboarding due to “too many steps” or an “unnecessarily complex” onboarding process

  • stopped using the app when they didn’t see “good” loads in their first search

  • would find the product more useful if onboarding were shorter & there was more freight in the system

Retained Users

  • reported that the product’s biggest value proposition is its free and convenient placement in the ELD

  • agreed with dropped users that the lack of ‘good’ loads in the system is a real barrier to regular usage

Key Findings

  • both retained and dropped users report that the onboarding process feels like a barrier to entry

  • many users compare the onboarding experience to other load boards where you can ‘try before you buy’ and report that they were expecting something similar on their first visit

Let the experimenting begin!

After synthesizing the qualitative and quantitative research, it was clear that users were asking for a shorter, less intrusive onboarding process that got them right to the content in as few steps as possible. One user put it best, saying, “Don’t put this big long thing in-between people and the thing they came to see.”

I structured the next steps with this insight in mind, designing four experiments to get the user closer and closer to the main event. I partnered with the engineering team to implement them as a rolling series.

We began with the lowest-lift experiments, and let each run for a period of two weeks. After each two-week period I reviewed the onboarding analytics to determine whether an additional step was needed.
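For anyone wondering how to judge a two-week read like that, one lightweight option (my own illustration here, not the team’s exact analysis) is a two-proportion z-test on conversion before and after the change:

```ts
// Two-proportion z-test: did conversion really move between the baseline
// window and the experiment window? The sample numbers are illustrative.
function twoProportionZ(
  convertedA: number, totalA: number, // baseline window
  convertedB: number, totalB: number  // experiment window
): number {
  const pA = convertedA / totalA;
  const pB = convertedB / totalB;
  const pooled = (convertedA + convertedB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}

// |z| > 1.96 means the change is significant at roughly the 95% level.
const z = twoProportionZ(129, 2000, 158, 2000);
console.log(z.toFixed(2)); // ~1.78 here: suggestive, but not conclusive
```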

Here’s a peek at what those experiments looked like 👇

Experiment 1 - Progress Bar

My first step was to work with the UI team to add a progress tracker bar to the onboarding flow, to illustrate that it was just three steps & help address user concerns that the flow was too long.

Experiment 2 - Combine Pages

I continued to work with the UI team to implement small changes to the onboarding modals - this time combining two pages of the old onboarding flow into one, removing a step from the flow.

Experiment 3 - Lead With Value

In our last ‘low lift’ UI experiment, I added a key value proposition to the loading screen, capitalizing on previously unused screen real estate & (hopefully) increasing user buy-in before they encountered the onboarding flow.

Experiment 4 - Straight to Search

After testing three low-lift UI adjustments with minimal improvements to the conversion rate, it was time to try something a little more drastic.

The final experiment was an A/B test, designed to give users exactly what they asked for: skipping the onboarding flow and going directly to search.

Here’s how I did it 👉

This excerpt from the experiment Confluence doc shows the setup of our split-test experiment - routing 50% of our visitors straight to the search page, and allowing them a set number of interactions before prompting them with the onboarding flow.
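Under the hood, routing for a split test like this can be as simple as a deterministic 50/50 bucket on the user ID plus an interaction counter. Here’s a sketch under those assumptions - the hash, bucket names, and interaction threshold are mine, not the production implementation.

```ts
// Sketch of the split-test routing: hash each user ID into one of two
// stable buckets; the "straight to search" arm defers the onboarding
// prompt until N interactions. Names and threshold are illustrative.
function bucket(userId: string): "control" | "straightToSearch" {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // cheap, deterministic
  }
  return hash % 2 === 0 ? "control" : "straightToSearch";
}

const INTERACTION_LIMIT = 3; // e.g. load searches before we prompt

function shouldShowOnboarding(userId: string, interactions: number): boolean {
  if (bucket(userId) === "control") return true; // onboard immediately
  return interactions >= INTERACTION_LIMIT; // defer until they've explored
}
```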

The results of this experiment (straight-to-search experience shown in orange) clearly supported our hypothesis: when users experience the product before being prompted to onboard, they are much more likely to complete the onboarding flow.

Experiment 4 - Results

Learnings & Next Steps

This project was unique in that I had the time and resources to conduct a methodical research effort in the ideal order. I was able to begin with quantitative page-view and funnel analysis of the entire onboarding experience, and was provided heatmapping and user-recording tools to confirm the early quantitative findings before moving on to gathering qualitative insights through interviews.

I look at this research effort as a ‘best-case scenario’ that I have rarely been able to replicate in other projects.