Boothfinder

Aligning the team around customer centric development

Summary

This project began with a collaborative workshop I conducted with representatives from Product, IT, our Service Center, and executive leadership to build alignment. That effort led us to prioritize a redesign of the top of our funnel, which had become a major impediment to exhibitor ordering. The redesign increased the BoothFinder's single-session completion rate by 11% and reduced the number of views per completion from 15 to 5. Along the way, I created flow and logic diagrams to illustrate the desired behaviors, wrote front-end HTML and CSS, worked with developers to fine-tune the implementation, and conducted user research at multiple stages so we could develop in a lean environment.

BoothFinder Completion

+11%

Views Needed per Completion

−66%

Feature Prioritization

One of the overarching challenges I faced with this platform was defining and articulating a clear vision for the product and getting buy-in from the various departments that dealt with it. Everyone had ideas about which features should be introduced or replaced, so how could we decide what our roadmap should look like?

To remedy this problem, I conducted an affinity diagramming session with senior members of several interested departments, as well as all members of the product and development teams. I brought in recorded sessions of users performing tasks on the site, and had team members watch them and write down their observations on sticky notes. All notes were then placed on a board, and we worked as a group to categorize them into a natural taxonomy. This produced clear, weighted categories with input from everyone in the group, which allowed us to prioritize the projects we would tackle based on direct observation of user behavior and with buy-in from all the departments involved. The BoothFinder feature emerged as the clear priority.

Interview script with notes

Defining the Problem

From the user tests we recorded for the diagramming session, and from my previous observations during user testing, I was able to discern three major patterns within the BoothFinder feature that were keeping users from completing it, and thus halting their progress at the top of the funnel.

First, the continue button was hidden until the user had filled out all the requisite fields. With no visible path forward and no validation errors to tell them which fields they had missed, users could not form a clear picture of how to progress.

Second, the existing design asked them to make several choices on the same page. This increased cognitive load, and thus error rate.

Third, much of the information was hidden below the fold within a modal. Users rarely expect to scroll within a modal, so they often missed these fields completely.

Gathering Requirements

Since different regions had different requirements, I conducted stakeholder meetings with the relevant people in those regions, as well as with the product manager. The US was the most complicated, since the tool needed to check for exhibitor list information and present the search screen only if it existed. EMEA needed the user to confirm or correct the information presented if the given dimensions were 0 x 0, which happened when that information was absent earlier in the process.

The BoothFinder served multiple purposes, some of which were specific to a particular region or were present in some circumstances but not others. This made it necessary to define the conditions under which certain things would be shown to the user.

I created a basic logic diagram to articulate the sequence of conditions and user actions that the feature should present, and where the relevant information was coming from that would dictate the logic of the feature.
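The branching that the diagram captured can be illustrated with a minimal sketch. This is not the production code; the function name, field names, and the "skip" outcome for US users without exhibitor list data are all hypothetical, chosen only to show the shape of the conditions described above.

```javascript
// Hypothetical sketch of the BoothFinder entry logic.
// Field names (region, hasExhibitorList, boothWidth, boothDepth)
// are illustrative, not the production schema.
function firstBoothFinderStep(context) {
  if (context.region === 'US') {
    // The US flow shows the booth search only when exhibitor
    // list data exists; 'skip' is an assumed fallback.
    return context.hasExhibitorList ? 'search' : 'skip';
  }
  if (context.region === 'EMEA') {
    // EMEA asks the user to confirm or correct dimensions when
    // they arrived as 0 x 0 (absent earlier in the process).
    if (context.boothWidth === 0 && context.boothDepth === 0) {
      return 'confirm-dimensions';
    }
  }
  return 'search';
}
```

Expressing the diagram this way also made it easy to walk stakeholders through each branch and confirm that no regional case had been missed.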

Best Practice Research

In order to make the best product possible, I conducted some general best-practice research on onboarding flows. I knew the general problems with the interface, but I wanted to set a few guidelines for myself before defining the solution.

The first was that the user should be able to form a clear picture of what to do next at every step. The hidden button in the previous design kept users from knowing what to do next, or from discovering what they needed to do to satisfy the conditions of the current screen. I decided it was preferable for the user to click continue and receive a clear validation error telling them what to fix, rather than being unable to progress at all.
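That guideline amounts to validating on click rather than gating the button. A minimal sketch, with hypothetical field names and error messages (the production validation lived in the Knockout view models):

```javascript
// Always-enabled continue: validate when the user clicks, and
// surface specific errors instead of hiding the button until
// the form happens to be complete.
// Field names and messages are illustrative.
function validateStep(fields) {
  const errors = [];
  if (!fields.companyName) {
    errors.push('Please enter your company name.');
  }
  if (!fields.boothSize) {
    errors.push('Please select a booth size.');
  }
  return errors; // an empty array means the user may continue
}
```

The key design choice is that the function returns every outstanding error at once, so a single click tells the user the complete set of conditions left to satisfy.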

Secondly, the user should be presented with one clear, primary choice at a time. This would reduce cognitive load and allow the user to more quickly form a clear plan of action.

Third, necessary information should be presented in clear, concise chunks that were always visible on the page, even if that meant that an extra step was necessary. It would be easier for the user to make a series of single decisions rather than having to scroll and make multiple decisions on the same page.

I reworked every screen to emphasize a clear primary choice. Since the login screen could be shown if the user was not currently logged in, I started there.

Prototype Testing

I created the designs and took them through a few quick iterations, with stakeholders to make sure they met business needs and with other designers to gather critiques. At this point, I wanted to find a way to put them in front of actual users before we spent months developing them. Since I wanted to test a few major elements of the interaction design and get deeper insights from users at the same time, I decided that moderated testing would be ideal.

I used InVision to link together the screens I had created, to convey the general feel of a stepped process. User testing needed to be remote to best reach our actual users and keep costs down, so I used Zoom, our video conferencing system, which let me show the prototype, give the user control of the mouse, and record the session. Once I had walked through the setup with a colleague, I wrote a script and began recruiting existing users of our site through email.

User testing with InVision gave us confidence that users could at least form a basic plan for navigating the feature. Throughout development, we continued to user test when possible, focusing on the aspects of the feature that were harder to represent in an InVision prototype.

The testing process surfaced a few changes that helped users understand the information they were presented more easily, and we incorporated them into our development. I wrote the HTML and CSS for each view and worked with the development team to implement them within the Knockout and .NET framework.

Rollout and Post Release

Before going live in production with the change, we created an introduction site to familiarize our call center and internal stakeholders with what would be changing, and let them try it in our UAT environment. We wanted to make sure we weren't rolling out major changes without internal support for the disruption they would cause.

The new BoothFinder design was incredibly successful. The completion rate of the feature immediately spiked by nearly 15%, then settled roughly 10% above the previous benchmark. In addition, the increased ease of use at the top of the funnel rippled through the site, resulting in greater adoption and higher overall order quantity.

Online order share, Aug 2015

63%

Online order share, Aug 2016

75%

Retrospective

This project was a sort of bookend to the checkout project; together they culminated in a 48% year-over-year increase in revenue through the tool, grew our market share by roughly 4%, and allowed us to begin using the tool internationally. In particular, because the feature affected users at the top of the funnel, it dramatically increased the number of users who chose to purchase online, since they were no longer turned away at their first interaction.

Working on this project required me not only to leverage my skills in user research and design, but to grow beyond them in several ways. Conducting workshops with team members and stakeholders allowed us to gain internal alignment on our priorities and act as a coherent team across departments. The JavaScript library we used to build the feature in production made it difficult to measure results with simple analytics goals, so I learned Google Data Studio and taught myself enough JavaScript to create custom events.
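The custom events were along these lines. This is a sketch assuming the Universal Analytics `ga('send', 'event', ...)` API; the category and action names are illustrative, and the sender is injected so the logic can be exercised without the analytics script loaded:

```javascript
// Sketch of a completion event for the BoothFinder, assuming the
// Universal Analytics ga() command queue. Category/action names
// are hypothetical. `send` stands in for window.ga.
function trackBoothFinderComplete(send) {
  send('send', 'event', {
    eventCategory: 'BoothFinder',
    eventAction: 'complete',
  });
}

// In production this would be called as:
//   trackBoothFinderComplete(window.ga);
// from the handler that fires when the final step succeeds.
```

Events like this made completions countable in analytics even though the single-page flow never triggered the URL-based goals we had relied on before.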

Previous

Building a data driven process - Checkout Redesign