IBM Tealeaf Predictive Analytics - Summer 2015

I spent summer 2015 as a product design intern at IBM working on IBM Tealeaf CX Mobile. Tealeaf is customer analytics software, available as SaaS or on-premises, used by Fortune 500 companies including Target, Dell, Air France, and Wells Fargo. The software ingests raw user data from mobile apps and websites and compiles it into reports based on events set up by Tealeaf users. Tealeaf is used differently by every company that owns it; popular use cases include bug detection, customer segment analysis, fraud detection, and A/B testing.

The Project

I was tasked with creating a feature for Tealeaf that uses historical data to make predictions about customer behavior. I was responsible for user research, ideation, conceptualization, wireframing, user testing, and high-fidelity mockups of the feature.

My feature aimed to solve two problems using predictive analytics:

1. Tealeaf is complicated and has a steep learning curve

2. Tealeaf shows so much data that it is hard for a customer to find just the information they want

User Research: Austin Event

User research for my project was done through a customer discovery event that I planned and organized. Customers from PNC Bank, Dell, Dollar Bank, and Rooms To Go flew to Austin, Texas to take part in the two-day event, which included design exercises and brainstorming sessions.

Personas

The personas, Kristy and Tom, belonged to the IBM Customer Analytics persona collection. The users at the Austin event embodied the responsibilities, challenges, goals, and pain points of our personas.

Austin Results

A Tealeaf user's workflow followed a drill-up, drill-down pattern. Inconsistencies were identified in charts on the dashboard. The user drilled into the charts, segmented the sessions by different breakout criteria, and then replayed sessions until an abnormality, or "customer struggle", was noticed. The user then drilled back out and checked whether any other sessions experienced the same customer struggle.

Overall, the process was a labor-intensive and time-consuming guess-and-check method. Users wanted an easier way to gain valuable insights.

Sketches

Using PostIt Notes to explore possible user paths through the software to accomplish different goals.

Brainstorming different data visualizations.

An early concept for a "struggle dashboard". Predictive analytics would be used to recognize patterns of customer struggle and aggregate them on a dashboard.

Playing with the idea of aggregated user "paths" through the interface. Using historical data, search algorithms would discover customers that experienced similar struggle situations and display them as paths, allowing the Tealeaf user to see what went wrong and where.


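The path-aggregation idea can be sketched roughly in code. This is purely illustrative: the session shapes, event names, and the grouping step are my assumptions, not how Tealeaf actually aggregates sessions.

```python
# Hypothetical sketch of the aggregated "paths" concept: group customer
# sessions that follow the same sequence of screens so shared struggle
# points stand out. Session data here is invented for illustration.
from collections import Counter

def aggregate_paths(sessions):
    # Count how many sessions took each exact path through the app.
    return Counter(tuple(s) for s in sessions)

sessions = [
    ["home", "search", "product", "cart"],
    ["home", "search", "product", "cart"],
    ["home", "search", "error"],
]
for path, count in aggregate_paths(sessions).most_common():
    print(count, " > ".join(path))
```

A real implementation would merge near-identical paths rather than exact ones, but even exact grouping shows where most customers diverge.
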
A more fleshed-out version of the aggregated user path idea. A score was added to quantify the level of struggle.

Storyboard for what eventually became the final concept.   

Wireframes

Concept 1: Alerting Flow

How this solved the problem:

Instead of digging for customer struggle patterns, the Tealeaf user would be notified of aggregated struggles. The struggles are linked to a customer-defined KPI. The algorithm would scan customer sessions and aggregate abnormalities, learning in the process.
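
The alerting behavior might look something like the following sketch, which flags a KPI reading that deviates sharply from its history. The z-score test, the threshold, and the checkout-rate KPI are my stand-ins; the project did not specify the actual detection method.

```python
# Minimal sketch of the alerting concept: flag a KPI value that is far
# from its historical mean. The statistics used here are assumptions,
# not Tealeaf's algorithm.
import statistics

def flag_abnormal(history, current, z_threshold=2.0):
    # Flag the current KPI reading if it lies more than z_threshold
    # standard deviations from the historical mean.
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return False
    return abs(current - mean) / stdev > z_threshold

checkout_rate_history = [0.42, 0.40, 0.41, 0.43, 0.39]
print(flag_abnormal(checkout_rate_history, 0.12))  # sharp drop -> alert
```
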

Concept 2: Event Manager Flow

How this solved the problem:

Similar to the alerting flow, the Tealeaf user would define a KPI. The algorithm would then aggregate customer patterns into cohorts and rank them by the number of customers affected and the severity of struggle. Insights were easily accessible and actionable.
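
The ranking step could be sketched like this. The `Cohort` shape, field names, and sort order are illustrative assumptions, not Tealeaf's data model.

```python
# Hypothetical sketch of the cohort-ranking idea from Concept 2:
# rank struggle cohorts by customers affected, then struggle severity.
from dataclasses import dataclass

@dataclass
class Cohort:
    label: str
    customers_affected: int
    struggle_score: float  # 0.0 (no struggle) to 1.0 (severe)

def rank_cohorts(cohorts):
    # Sort descending: most customers affected first, ties broken
    # by struggle severity.
    return sorted(
        cohorts,
        key=lambda c: (c.customers_affected, c.struggle_score),
        reverse=True,
    )

cohorts = [
    Cohort("checkout timeout", 1200, 0.9),
    Cohort("login retry loop", 4500, 0.6),
    Cohort("search dead end", 4500, 0.8),
]
for c in rank_cohorts(cohorts):
    print(c.customers_affected, c.struggle_score, c.label)
```
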

Concept 3: Comparative Replay

How this solved the problem:

Instead of replaying a single session and then drilling back out to see if there were similar sessions, the Tealeaf user could compare the session to the rest directly in replay. Here the user can see whether the struggle session is having a significant impact on their business and gain a better understanding of how users use their website or mobile app.

Final High Fidelity Mocks

How this solved the problem:

After talking to many customers and consulting with the data science team, I connected the algorithm's need for context with a user's ability to add that context. If a user labeled patterns as success or failure, the algorithm could learn more efficiently. Once the user labeled a customer pattern, the algorithm could identify similar patterns as successes or failures.
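
The labeling idea can be sketched as simple label propagation: once a user tags a pattern, similar unlabeled patterns inherit the tag. The event names, the Jaccard similarity measure, and the threshold are my assumptions for illustration.

```python
# Illustrative sketch of user labeling feeding the algorithm: a pattern
# the user tagged as "success" or "struggle" lends its label to
# sufficiently similar unlabeled patterns. Not Tealeaf's actual method.

def similarity(a, b):
    # Jaccard similarity over the set of events in each session pattern.
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

def propagate_labels(labeled, unlabeled, threshold=0.5):
    # labeled: list of (pattern, label); unlabeled: list of patterns.
    results = {}
    for pattern in unlabeled:
        best_pattern, best_label = max(
            labeled, key=lambda item: similarity(pattern, item[0])
        )
        if similarity(pattern, best_pattern) >= threshold:
            results[tuple(pattern)] = best_label
    return results

labeled = [
    (["login", "cart", "checkout", "confirm"], "success"),
    (["login", "cart", "error", "cart", "error"], "struggle"),
]
unlabeled = [["login", "cart", "error", "retry"]]
print(propagate_labels(labeled, unlabeled))
```
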

Instead of introducing an entirely new dashboard or mode to the software, I integrated struggle detection into session search. Session search was clean and easy for a novice customer to understand. By aggregating sessions through search, the drill-up, drill-down, guess-and-check pattern was removed. Insights became easily accessible and directly actionable.