When I saw that the Digital Analytics Association was doing a symposium on “Creating Great Customer Experiences with Analytics,” I had to go. During my years at Omniture, I discovered Tealeaf through a client partnership and was enamored with the power and depth of its data. Ever since I joined Tealeaf over three years ago, I have been applying analytics to the customer experience, and it has been really fun. With the version 8 release of Tealeaf, the quantitative data collection has been amazing, and it gets more exciting with every new release (9 is going to knock my socks off).
At the symposium, I enjoyed the broad, multi-channel view of the customer experience, as well as the combined quantitative and survey-qualitative approach presented by the Apollo Group. The Bing A/B testing session was a great look at testing and measurement (never trust your intuition), with some great insights.
Still, as good as the DAA symposium in Seattle was, I felt it lacked much of the analytics that Tealeaf does on a daily basis to identify struggle and improve the experience. Most of these metrics are easily gathered and acted upon in Tealeaf, but even for those who do not use Tealeaf, they can point in the right direction.
One of the main things we focus on when looking at the customer experience is struggle. Normally we zero in on a specific process like checkout, login, or registration and really dive into how the customer is experiencing it. Tealeaf is a deep-dive tool, collecting an ocean of data (we often compare web analytics to a fun water park of data). But in order to deep-dive, we need to know where to dive. That is, of course, where quantitative meets qualitative and vice versa. We need to find those needle sessions in the haystack of data. This is where tracking struggle comes in.
Restarting the process over and over.
One of the main things we track is users who start the process over and over again. We love to dive into those sessions where users started the registration process three or more times. What happened? What were they doing? Did they complete? Often we find things in the replay that we can quickly code up as a new event, or use a text search to see how it affected past customers. This is a simple way to find struggle, and it may be underutilized because many companies lack the tools to see the actual session with the struggle.
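Counting restarts is simple once you have a per-session event log. A minimal sketch in Python, assuming a hypothetical log of (session_id, event) pairs; the event names and the three-restart threshold are illustrative, not Tealeaf APIs:

```python
from collections import Counter

# Hypothetical session event log: (session_id, event) pairs.
events = [
    ("s1", "registration_start"), ("s1", "registration_start"),
    ("s1", "registration_start"), ("s1", "registration_complete"),
    ("s2", "registration_start"), ("s2", "registration_complete"),
    ("s3", "registration_start"), ("s3", "registration_start"),
]

# Count process starts per session.
starts = Counter(sid for sid, ev in events if ev == "registration_start")

# Sessions that started the process three or more times are flagged for replay.
restart_struggle = {sid for sid, n in starts.items() if n >= 3}
print(restart_struggle)  # {'s1'}
```

The flagged session IDs are exactly the sessions you would then pull up in replay.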
Time to complete process.
Another easy one is looking at those who do complete the process and how long they took. By finding the median and standard deviation of the time to complete, you can look at those who took more than three standard deviations longer than the median to get through the process. Diving into those sessions gives you a sense for the outliers who took a long time. Where did they stop? Or, after starting the process, where did they go? Pulling in the path for these users when they do finally complete is also interesting.
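The outlier cut above can be sketched with Python's statistics module on made-up completion times. Note that with only a handful of sessions, a single extreme value inflates the standard deviation enough to hide itself, so a reasonable sample size matters:

```python
import statistics

# Hypothetical completion times (seconds) for sessions that completed checkout:
# 19 typical sessions plus one extreme outlier ("s20").
completion_times = {f"s{i}": 90 + i for i in range(1, 20)}  # 91..109 seconds
completion_times["s20"] = 1200  # one session took 20 minutes

median = statistics.median(completion_times.values())
stdev = statistics.stdev(completion_times.values())
threshold = median + 3 * stdev

# Sessions more than three standard deviations above the median are replay candidates.
slow_sessions = [sid for sid, t in completion_times.items() if t > threshold]
print(slow_sessions)  # ['s20']
```

In practice you would compute the median and deviation over a large historical window, then apply the threshold to new sessions.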
Server generation and network performance in the process.
Because Tealeaf sits in the data center, measuring server generation times and network times is easy, and it is a great glimpse into the customer experience. I want to know how many sessions experience a page that takes more than 5 seconds to generate on the server. Sometimes I will exclude the final page in the process if it is built to allow for waiting (like the final page in checkout with a processing image). Often we will measure the actual API calls, both internal and external, to see what the real culprit is. And although most clients have a network fast enough to push the page out quickly, I have been with some clients where we often see pages taking more than 2 seconds to reach the customer. That definitely counts as struggle; I know if I ever have to wait more than 2 seconds, I get a little fidgety. Replaying these sessions, we see users clicking the back button and trying a page over and over, and often giving up. That's where some small changes, either helping the user wait or pushing the page out quicker, can make a big difference. I love that Bing ran an A/B test that artificially inflated server generation time to see how it affected revenue; their finding: 10 milliseconds = roughly 250k/year. Amazing…
Error Message repeats in the process.
One of the main things we track beyond the process itself is error messages, and we group them into one of five groups: System Error, Application Error, Business Rule Error, User Error, and General Message. Beyond knowing that the user saw a message in the process, we want to know if they had repeats of those errors, and we look at each error group separately. If the user is seeing several system errors in their process, there is a system struggle. If they see several user errors, it is a user struggle, and so on. General messages aren't counted toward struggle. Once again, diving into the session with the struggle gives further insight.
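The per-group repeat logic can be sketched as follows. The five group names come from the article; the session data and the two-or-more repeat threshold are illustrative assumptions ("several" is not defined precisely):

```python
from collections import Counter, defaultdict

# Hypothetical error events, already classified into the five groups above.
session_errors = [
    ("s1", "System Error"), ("s1", "System Error"), ("s1", "System Error"),
    ("s1", "General Message"),
    ("s2", "User Error"), ("s2", "User Error"),
]

REPEAT_THRESHOLD = 2  # assumption: two or more of the same group is a "repeat"

# Count errors per (session, group); General Messages never count toward struggle.
counts = Counter((sid, grp) for sid, grp in session_errors if grp != "General Message")

struggles = defaultdict(list)
for (sid, grp), n in counts.items():
    if n >= REPEAT_THRESHOLD:
        struggles[sid].append(grp)

print(dict(struggles))  # {'s1': ['System Error'], 's2': ['User Error']}
```

Keeping the groups separate is what lets you label the struggle as a system problem versus a user problem.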
Path from process to contact us/help/FAQ pages.
In web analytics, a user moving from a process to a help page is a normal red flag, and if you are not already tracking this, it's easy to set up. The added benefit, once again, is the replay in Tealeaf. And if you are pushing customer support data to Tealeaf in real time, you can also get support questions attached to the same session. That is powerful.
Drop-off between process steps.
This is probably the main thing done in web analytics to identify struggle: why are users not moving forward in the process? If you are not already looking at process-step interaction, it can be added in web analytics. The only addition I make is a step between each process step: the process-step attempt. This simply looks at those who request the next page but get delivered something different. You would be surprised how often I have seen users redirected to the home page even though they should have gone forward in the process (without coding the attempt, this would look like navigation to the home page). Looking at submit button clicks can also get you close, but I tend to trust the actual request to the data center to identify an attempt.
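The attempt-versus-delivery check boils down to comparing the requested page against what the server returned. A minimal sketch over a hypothetical request log (the paths and field names are made up for illustration):

```python
# Hypothetical request log: the page the user requested and the page
# the server actually delivered.
requests = [
    {"session": "s1", "requested": "/checkout/payment", "delivered": "/checkout/payment"},
    {"session": "s2", "requested": "/checkout/payment", "delivered": "/"},  # bounced home
    {"session": "s3", "requested": "/checkout/review",  "delivered": "/checkout/review"},
]

# A "process-step attempt" is the request itself; it counts as struggle
# when the delivered page differs from the requested next step.
failed_attempts = [r["session"] for r in requests if r["delivered"] != r["requested"]]
print(failed_attempts)  # ['s2']
```

Because this keys off the actual request hitting the data center, it catches redirects that a click-based funnel would misread as voluntary navigation.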
Large dwell times in the process.
Another thing often done in web analytics is looking at large dwell times attached to a page in the process. We do the same with Tealeaf's quantitative tools; it can point to a troubled page or session to dive into further.
Large form field dwell times in the process.
When a user is stuck on a page, it is often a form. Looking at the individual form fields and their dwell times is super valuable. Where are they struggling to move forward? Can we rearrange the form fields, or move that field's data out to a separate process?
Slow browser render times in the process.
Once again, this should already be tracked. Which browsers are rendering the page slowly in the process? Can we fix the render time, or build a quicker page for certain browsers?
Survey deep-dives.
I loved the Apollo Group's presentation on using survey data. Those insights are EXTREMELY valuable, and Tealeaf is another tool to connect those dots. If you ever get survey responses that don't make much sense, pulling up the session in replay will often give you insights you could not get any other way. When you hear what they are complaining about and then see it, it gets to the point pretty quickly…
Customer service deep-dives.
As an analyst interested in the customer experience, ensure you have regular meetings with customer service. They will have insights about the web site that you will not get anywhere else. Remember, if you treat customer service like they are the customer, then you will find some great things to improve the customer experience– and save on your call-center costs!
Now that we are tracking some of these struggles through the process, make sure you have built the proper KPIs to track them. Each struggle should be counted only once per session, or use visit-based metrics.
General Process Struggle: #Process Starts with any struggle / #Process Starts
Process Restart Struggle: #Process Starts with restart struggle / #Process Starts
Process Long Completion Time Struggle: #Process Starts with long completion time / #Process Starts
Process Performance Struggle: #Process Starts with a page with large generation or network time / #Process Starts
Process System Error Struggle: #Process Starts with System Error repeats / #Process Starts
Process Application Error Struggle: #Process Starts with Application Error repeats / #Process Starts
Process Business Rule Error Struggle: #Process Starts with Business Rule Error repeats / #Process Starts
Process User Error Struggle: #Process Starts with User Error repeats / #Process Starts
Process to Help Struggle: #Process Starts with a path to help pages / #Process Starts
Process with Large Dwell Time Struggle: #Process Starts with a page with large dwell time / #Process Starts
Process with Large Form Field Dwell Time Struggle: #Process Starts with a page with large form field dwell time / #Process Starts
Process with Large Render Time Struggle: #Process Starts with a page with large render time / #Process Starts
Process with Survey Identified Struggle: #Process Starts with survey-identified struggle / #Process Starts
Process with Customer Service Identified Struggle: #Process Starts with customer-service-identified struggle / #Process Starts
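Every KPI above has the same shape: sessions that started the process and hit a given struggle, divided by all process starts. A minimal sketch, assuming hypothetical per-session boolean flags that are each set at most once per session:

```python
# Hypothetical per-session flags, each recorded at most once per session.
sessions = [
    {"process_start": True,  "restart": False, "system_error_repeat": True},
    {"process_start": True,  "restart": True,  "system_error_repeat": False},
    {"process_start": True,  "restart": False, "system_error_repeat": False},
    {"process_start": False, "restart": False, "system_error_repeat": False},
]

starts = sum(s["process_start"] for s in sessions)

def struggle_rate(flag):
    # KPI: sessions that started the process AND hit this struggle, over all starts.
    hits = sum(s["process_start"] and s[flag] for s in sessions)
    return hits / starts

print(f"Restart struggle:      {struggle_rate('restart'):.0%}")
print(f"System error struggle: {struggle_rate('system_error_repeat'):.0%}")
```

Counting flags per session (not per event) is what keeps each KPI an honest "share of process starts" rather than an event tally.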
Setting this all up may take some time, and being proactive with surveys and customer service may need to become a process of its own. But building this out will give you an unprecedented view into your customer experience. And having the data and replayable sessions goes a LONG way toward transforming your organization to be more focused on the customer. Best of luck!
Feel free to contact me on twitter: @solanalytics.