Tealeaf and SiteCatalyst Integration Response

I recently worked with Adam Greco on a post about integrating Tealeaf and SiteCatalyst. You should check it out when you get the chance; Adam is a very informed individual, and his passion for web analytics and digital marketing is, in my opinion, unsurpassed. I first met Adam years ago when I was “accepted” into the best practices group, now the consulting group, at Adobe. I say “accepted” because I was more of a techie than an analyst at the time (they let me in because I knew the data). I enjoyed working with Adam, Brent Dykes, Nathan Frodsham, Brian Jenkins, and others, and I always liked the business side of digital marketing, measurement, and optimization.

Now, Adam’s post does an exceptional job of explaining how the integration works without selling Tealeaf as a product. (I know, Adam, you’re not there for the vendors, you’re there for the clients!) Having come from the web analytics world with Adam, I wanted to give my two cents beyond the SiteCatalyst integration on how the two tools can work together. Well, how web analytics and Tealeaf as a product can work together.

Adam already talked about how web analytics excels at slicing and dicing data. It is a great way to find issues and opportunities. But when those issues/opportunities are found, it is often difficult to say why they happened. Borrowing from Brent Dykes here, it is like playing Clue: you know that Professor Plum did it in the library with the candlestick, but you don’t know why. You need the full story; you need more data. This is where quantitative leans heavily on qualitative to get the story. Sometimes a survey will point you in the right direction, or your customer support team will fill in the story. This is where having more data helps. And the thing that really attracted me to Tealeaf was not the replay functionality, it was the huge amount of data storage they have tackled to make accurate replay possible. With Tealeaf you can collect everything. Every server call, every internal API call, every external API call, every UI interaction. It tells the whole story because you have everything that happened to that customer laid out before you. The replay is great for a quick gut-level understanding of what happened, but being able to dive into a deep ocean of data at the individual level tells the entire story. It’s like having the novel to the game of Clue right at your fingertips. Yes, it takes some digging to find the issues, but it’s easy to become adept at pulling the story out of that data. The point is, having massive amounts of data like that at the individual level can tell you the whole story.

Adam also mentions that prior to version 8, slicing and dicing data in Tealeaf was not as powerful as using a web analytics tool, and that is certainly true. I was very lucky to come in to Tealeaf during the launch of version 8. I LOVE version 8. It does all the breakdowns and eventing that traditional web analytics tools do, and dimensioning has been built into all of the reporting really well. So slicing and dicing data to find issues/opportunities can be done directly inside the Tealeaf UI and eventing engine. Segments of users who abandon shopping carts for reason X, for example, can be easily tracked and reported in version 8, with the option to replay individual sessions. The beauty of having all the data is that what used to take weeks, sometimes months, to pull out can often be found by replaying a session and diving into the code for that session. We are talking 20 minutes versus weeks or months, simply because you have access to all the data.
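
To make that kind of breakdown concrete, here is a minimal sketch of the same analysis done outside of Tealeaf against a hypothetical session-level export. The file name and columns (session_id, cart_abandoned, abandon_reason, cart_value) are assumptions for illustration only; inside Tealeaf version 8 this would be a report built in the UI, not code.

```python
# Minimal sketch: cart abandonment broken down by reason, run against a
# hypothetical session-level export. The file name and column names are
# assumptions for illustration, not Tealeaf's actual schema.
import pandas as pd

sessions = pd.read_csv("session_export.csv")  # hypothetical export

abandoned = sessions[sessions["cart_abandoned"] == 1]

report = (
    abandoned.groupby("abandon_reason")
    .agg(sessions=("session_id", "nunique"),
         lost_cart_value=("cart_value", "sum"))
    .sort_values("sessions", ascending=False)
)

print(report)
```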

Now, I hate using the word “replay”; to me it is more of a “deep dive.” You find the issue and you dive into its cause. That can mean looking at what the user saw in the UI, his or her UI interactions, what was available in the request/response, what happened with the internal/external APIs, and what happened on previous visits. When the issue is found, you don’t have to wait for your development team to code up the issue for measurement: you simply create events based on the data already gathered, and you soon know the extent of the problem. Or, if you have CxConnect, you can run a job to learn how it affected your past visits, but more on that next.

During my web analytics years I was a heavy user of Data Warehouse, which was basically NoSQL, a flat file of clickstream data collected for analytics. I can’t tell you how many times we had to solve issues or dig into an analysis using the data warehouse. Now, the thing that really blew my mind when I heard of Tealeaf was the storage they do on sessions: full session data stored for days, months, and sometimes years. That means all of that raw data is available at any time to be pulled and used for past analyses when eventing/dimensioning missed something. This is CxConnect. You have this store of ALL the data that is used to create the replayable sessions, and if you need to pull data out of those sessions, you can do it. Using the Data Warehouse for web analytics analysis, around 50% of the time we would have to tell the client it could not be done without making a change to their implementation. The beauty of CxConnect is that at any time you can pull out data that was lost to your web analytics tool. It is seriously amazing to me. That means telling the client it can be done maybe 95% of the time. It’s as if you were able to open a time portal and go back weeks, months, or even years to tell your developer to code that one thing for your web analytics tool. Now, how can you use this functionality with your present web analytics tool? Simple: pull out the data using CxConnect and insert it into your web analytics environment. This gives you access to past data and allows for side-by-side reporting.
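
As a rough illustration of that last step, here is a minimal sketch, assuming a CxConnect job has already produced a flat extract, of reshaping that extract into a tab-delimited file for a SiteCatalyst Data Sources upload. The file names and column headers are assumptions; the real headers should come from the Data Sources template generated for your report suite.

```python
# Minimal sketch: reshape a hypothetical CxConnect extract into a
# tab-delimited file for a SiteCatalyst Data Sources (FTP) upload.
# File names and column headers are illustrative -- use the template
# generated for your own report suite.
import csv

with open("cxconnect_extract.csv", newline="") as src, \
     open("datasource_upload.txt", "w", newline="") as dst:
    reader = csv.DictReader(src)               # e.g. date, page, error_count
    writer = csv.writer(dst, delimiter="\t")
    writer.writerow(["Date", "Evar 1", "Event 1"])  # illustrative headers
    for row in reader:
        # MM/DD/YYYY date, the page name into an eVar, a count into an event
        writer.writerow([row["date"], row["page"], row["error_count"]])
```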

The other thing that I LOVE about Tealeaf is that I can finally control DATA QUALITY. That’s right: there have been numerous articles in the digital analytics space about finding the balance between data quality and analysis. If you spend too much time on data quality, you sacrifice time you could have spent on analysis. Data quality was always a pet peeve of mine. I would be helping an analyst understand the clickstream data and have to explain why things were being collected in such a manner, or discover an anomaly in the client’s implementation that ruined the whole analysis. Now, during my analyses with Tealeaf, when I find pages that are not coded correctly, I simply change my event/dimension, document it, and my data is a little cleaner. Tealeaf does well with scrubbing process flows. By searching for unexpected process orders, you can quickly see how the events should be recoded by “replaying” (deep diving into) a session. Often there are sub- or side processes that reuse parts of other processes and need to be tracked separately, aggregated into the main process, or both. And if you can get your Tealeaf eventing/dimensioning to match or mirror your web analytics implementation, you can find the data measurement issues and slate them for updates so that your overall analytics outside of Tealeaf is more accurate.
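
Here is a minimal sketch of what that mirroring check could look like in practice, assuming you can export daily counts for the same event from both Tealeaf and your web analytics tool. The file names, column names, and the 5% threshold are all assumptions for illustration.

```python
# Minimal sketch: compare daily counts for the same event exported from
# Tealeaf and from the web analytics tool, and flag days where the two
# implementations diverge. File names, column names, and the 5%
# threshold are assumptions for illustration.
import pandas as pd

tealeaf = pd.read_csv("tealeaf_event_counts.csv")      # columns: date, count
webanalytics = pd.read_csv("sitecatalyst_counts.csv")  # columns: date, count

merged = tealeaf.merge(webanalytics, on="date", suffixes=("_tealeaf", "_wa"))
merged["pct_diff"] = (
    (merged["count_tealeaf"] - merged["count_wa"]).abs() / merged["count_tealeaf"]
)

# Days that disagree by more than 5% are candidates for a deep dive
# (replay a few sessions and see which implementation is wrong).
print(merged[merged["pct_diff"] > 0.05])
```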

Adam also touched on collecting data in Tealeaf and sending it to the web analytics tools. We have had many clients interested in doing this, simply because it is easy to create events/dimensions inside Tealeaf. If this data could be shipped to a web analytics vendor, the data collection could be changed on a dime, because Tealeaf builds events/dimensions on top of all the data it has already collected. But as Adam mentioned, the API token costs of doing this make it a bit impractical. Not to mention it may go against a web analytics vendor’s strategy of being the center of data collection. Many vendors are now leaning on their own proprietary tag management systems or encouraging clients to use Ensighten, TagMan, or Tealium. This often shortens the time to implement, but it may still require developer coding and may add more JavaScript weight to your page. I hope to see the token costs drop to nearly free for data collection purposes. Why does it matter whether the hit originated from the browser or the data came from the network? If anything, you get more data, because browsers aren’t blocking the request to the client’s data center.

Another point of integration is combining IT data with clickstream data. Tealeaf monitors many things on the server side that a web analytics tool would not: namely the time it takes a server to generate a page, network times, ack times, and so on. This data is extremely useful when you are trying to understand why conversion rates may have dropped, if not to show it was server performance then to rule server performance out. In a previous post I mentioned that I used to work for a web analytics company, and I was on a call with a client frantically trying to figure out why a campaign was performing so poorly from a conversion standpoint. From the campaign management perspective it was incredibly successful, with a large click-through-to-impression rate. It turned out the servers could not handle the “success” of the campaign. That meant a wasted campaign budget and a bad user experience. With Tealeaf, IT data can be aggregated across pages, campaigns, or any other dimension to create further reporting inside SiteCatalyst that points out server-side issues that could affect conversion. This could also include 400- and 500-level status code pages. By aggregating on a predefined time basis (10 minutes, 30 minutes, or maybe hourly), this data could be uploaded to SiteCatalyst with minimal API token costs.
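
Here is a minimal sketch of that aggregation step, assuming a hit-level IT extract with a timestamp, a page generation time, and a status code; the file name, column names, and the 10-minute bucket are assumptions for illustration. The rolled-up rows could then be uploaded the same way as in the Data Sources sketch above.

```python
# Minimal sketch: roll hit-level IT data up to 10-minute buckets so it can
# be sent to SiteCatalyst as a handful of rows instead of one call per hit.
# The input file and columns (timestamp, render_time_ms, status_code) are
# assumptions for illustration.
import pandas as pd

hits = pd.read_csv("tealeaf_it_extract.csv", parse_dates=["timestamp"])

hits["is_error"] = hits["status_code"].between(400, 599)

buckets = (
    hits.groupby(pd.Grouper(key="timestamp", freq="10min"))
    .agg(avg_render_ms=("render_time_ms", "mean"),
         hits=("status_code", "size"),
         errors=("is_error", "sum"))
)

print(buckets)
```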

Finally, Adam mentioned a couple of competitors in replay. And yes, they are competitors in replay, but not in deep diving. The pages those tools construct are not exactly what the user saw; they are a simple way to give an idea of how the user navigated the site. UI interactions, API calls, and the full request/response that the user sent and received are not available. So yes, this may give you a rudimentary understanding of the user experience, but it can’t give you an accurate view of what actually happened to the visitor.

Tealeaf is not a simple product, but it is chock full of all sorts of goodies that excite me every day I work with the set of tools available. I hope you can use Tealeaf as a companion metric gatherer to your web analytics tool, as a deep dive into web analytics segments, as a data quality tuner, as an IT data gatherer, and as a way to recover missed data.

Please feel free to ping me with any questions you may have on my Twitter account, @solanalytics.
