Adobe Summit 2018: Tips from an Adobe Analytics Rockstar

“From the dawn of time* we came; moving silently down through the centuries**, living many secret lives, struggling to reach the time of the Gathering***; when the few who remain will battle to the last. No one has ever known we were among you… until now.”

* Since 2015
** The past 3 years
*** Summit

There are hundreds of breakout sessions at Adobe Summit, but one of my favorites has always been the Analytics Idol/Rockstar session.

Real practitioners share real Adobe Analytics tricks from real use cases, so there are always excellent tactical ideas to take back to your office and use immediately.

It’s designed like a game show with prizes and a winner selected by the studio audience, so it’s a little more fun than your average session. For the past few years, Blast has had a continued presence onstage at this session: Joe Christopher presented in 2015, Brad Millett in 2017, and this year I, Halee Kotara, became the latest in the Blast/Highlander lineage to participate.

If you’d like to learn how to use Analysis Workspace to get through QA faster, or to set dynamic targets based on past performance, watch the video of the session or read my recap of the two tips below.

Tip #1: Using Adobe Analytics Analysis Workspace to Streamline the QA Process


Doing quality assurance testing on analytics tagging is tedious, manual, time-consuming, and soul-crushing. It’s also incredibly important and can’t be skipped, so I’m always trying to find ways to be more efficient.

We often rely on our tag management tools to do most of the heavy lifting, which means the data we’re capturing is usually fully formed by the time it comes through the debugger. In those cases, we can get by with checking that the values fire correctly in real time and just spot-checking the reports. But if you use Adobe’s context data variables and processing rules (and we often do on mobile apps), we can’t skimp on the second, report-checking step of QA. A lot can go wrong between the data firing in the debugger and the data appearing in the reports when processing rules are involved!
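To make that risk concrete, here is a minimal Python sketch of what processing rules do conceptually. The context data names and mappings are invented for illustration; they aren't from the session:

```python
# Hypothetical context data sent with a server call from a mobile app.
hit_context_data = {
    "app.screenname": "home",
    "app.section": "landing",
    "search.term": "running shoes",
}

# Processing rules (configured in the Adobe Analytics admin console) map each
# context data variable to a prop or eVar. Conceptually, they act like this:
processing_rules = {
    "app.screenname": "eVar1",
    "app.section": "prop2",
    "search.term": "eVar5",
}

# What lands in reporting is the mapped view. This mapping step is exactly
# where things can silently break, which is why report-level QA matters.
mapped = {processing_rules[var]: value for var, value in hit_context_data.items()}
print(mapped)  # {'eVar1': 'home', 'prop2': 'landing', 'eVar5': 'running shoes'}
```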

I’ve found that it’s extremely helpful to create an Analysis Workspace project purely for QA, and it saves me many hours per round of testing. Here’s the process I follow that helps me get through QA faster so I can get back to analyzing!

Step 1: Create a QA Spreadsheet

This sheet should contain columns for the following (a starter-template sketch appears below):

  1. A list of all the potential context data variables in the entire implementation
  2. The associated props, eVars, or standard dimensions that each context data variable is mapped to
  3. For each screen/page/custom link:
    1. The expected values for that server call
    2. The actual values that come through in the debugger (plus a pass/fail column and explicit instructions on what needs to be corrected)
    3. The actual values that come through in the report (plus a similar pass/fail column)

[Screenshot: the QA spreadsheet]
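If you want a head start on the sheet itself, here is a small Python sketch that writes a starter template with those columns. The variable names, mappings, and screens are placeholders; swap in your own implementation:

```python
import csv

# Placeholder context data variables, mappings, and screens; replace these
# with the values from your own implementation.
context_vars = {
    "app.screenname": "eVar1",
    "app.section": "prop2",
    "search.term": "eVar5",
}
screens = ["home", "search results"]

with open("qa_sheet.csv", "w", newline="") as f:
    writer = csv.writer(f)
    header = ["context data variable", "mapped to"]
    for screen in screens:
        header += [
            f"{screen}: expected",
            f"{screen}: debugger actual",
            f"{screen}: debugger pass/fail + fix",
            f"{screen}: report actual",
            f"{screen}: report pass/fail",
        ]
    writer.writerow(header)
    # One row per context data variable, sorted to match the Workspace order.
    for var in sorted(context_vars):
        writer.writerow([var, context_vars[var]] + [""] * (len(header) - 2))
```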

Step 2: Create an Analysis Workspace project 

The Workspace will essentially reflect your entire implementation, and yes, it might contain hundreds of Freeform Tables. As for metrics, it’s a good idea to always pull in occurrences or eVar instances. And depending on the variable, you’ll want to pull in any relevant custom events to validate, too. (For example, in the site search keyword eVar report, pull in the custom events for searches and null searches so you can validate that those events are firing when and where they should.)

[Screenshot: the Analysis Workspace QA project]

Step 3: Organize the Workspace

Within the project:

  1. Give each Freeform Table a title with the name of the associated context data layer variable(s)
  2. Arrange the Freeform Tables in alphabetical order by context data layer variable name

[Screenshot: the organized Workspace]

Step 4: Generate and Log Test Data

Go through all the pages or screens you’re validating and log the values you see in the debugger in your QA spreadsheet.

Step 5: Apply a segment to the Workspace project

It’s time to validate the values in the reports, but first narrow the data down to a specific server call so you can confirm that only the relevant values were set on that pageview or custom link. The segment should be hit-scoped and contain attributes that filter to the specific data you generated in a single server call, such as your user ID plus the screen name of the screen you’re checking.

[Screenshot: the hit-scoped QA segment applied to the project]
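If it helps to reason about what the segment is doing, here is a rough pandas equivalent of the same hit-scoped filter applied to hit-level rows. The column names are hypothetical, as if pulled from a Data Feed export:

```python
import pandas as pd

# Toy hit-level data; in practice this is what Data Feed rows might carry.
hits = pd.DataFrame([
    {"user_id": "qa-tester-01", "screenname": "home", "eVar1": "home"},
    {"user_id": "qa-tester-01", "screenname": "search results", "eVar1": "search results"},
    {"user_id": "someone-else", "screenname": "home", "eVar1": "home"},
])

# Hit-scoped segment: your user ID plus the screen you are checking, which
# narrows the data to the single server call you generated.
single_call = hits[(hits["user_id"] == "qa-tester-01") & (hits["screenname"] == "home")]
print(single_call)
```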

Step 6: Use Workspace to Validate Reporting Values

You now have a QA spreadsheet and an Analysis Workspace project that are both sorted alphabetically by context data variable name. This makes it very easy to scroll down in tandem and quickly log and validate the report values.

Bonus! The Workspace becomes your checklist. One of the things that is easiest to overlook in QA is when a value isn’t set at all. But when you see that big empty Freeform Table staring back at you, you’re far less likely to miss those errors.

[Screenshot: validating report values in the Workspace]

Takeaways

  • QA sucks and this will help you get through it faster.
  • It will also help you get through it more thoroughly and catch more mistakes.
  • If all your apps and sites are implemented consistently, you can recycle this Analysis Workspace for all your QA efforts.
  • You can share this Workspace with your whole team, ensuring everyone is following the same process for QA.

Tip #2: Setting and Measuring Targets Based on Historical Data


Remember these? (Shudder.)

The old targets in Reports & Analytics left a lot to be desired. They were entirely manual to manage, and if you wanted daily targets for a year, then you had the pleasure of entering 365 individual values. It’s no wonder these aren’t even available in Analysis Workspace!

[Screenshot: adding targets in Reports & Analytics]

Analysis Workspace makes it very easy to compare to a prior time frame, but what if you want to compare to a goal, not actuals?

Not only can you have target goals in Workspace, you can also base them on past performance and effectively build a simple predictive model, like a 10% increase over last year. All you need is a combination of custom date ranges, time-bound segments, and calculated metrics.

In this example, we’ll create a month-to-date target for the visits metric that is a 10% increase over month-to-date visits from last year.

Step 1: Create a custom date range for “Month to Date (This Year)”

This custom date range should have a start date of “Start of Current Month / rolling monthly” and an end date of “Start of Current Day / rolling daily.” This creates a dynamically updating, rolling date range that runs from the first day of the current month through end of day yesterday.

[Screenshot: the “Month to Date (This Year)” custom date range]

Caveat: If you’re using anything other than the default Gregorian calendar in Adobe, such as the 4-5-4 retail calendar, this won’t work seamlessly and will require adjustments every month.

Step 2: Create a custom date range for “Month to Date (Last Year)”

This is essentially identical to the custom date range above; the only difference is that the start and end dates have the additional configuration of “minus 365 days” to pull the same dates from the prior year.

[Screenshot: the “Month to Date (Last Year)” custom date range]
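For anyone who prefers to see the date math spelled out, here is a small Python sketch of both rolling ranges. The UI handles this for you; this just shows the logic:

```python
from datetime import date, timedelta

today = date.today()

# "Month to Date (This Year)": start of the current month through end of
# day yesterday, recomputed every day (the rolling behavior).
mtd_start = today.replace(day=1)
mtd_end = today - timedelta(days=1)

# "Month to Date (Last Year)": the same range shifted back by the "minus
# 365 days" configuration. Note that a flat 365-day shift drifts by one day
# across leap years; the same limitation applies to the UI setting.
mtd_start_last_year = mtd_start - timedelta(days=365)
mtd_end_last_year = mtd_end - timedelta(days=365)

print(f"This year: {mtd_start} through {mtd_end}")
print(f"Last year: {mtd_start_last_year} through {mtd_end_last_year}")
```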

Step 3: Create two time-bound segments

Since you can’t use a custom date range directly in a calculated metric, it’s necessary to create a couple of segments that simply contain those date ranges.

[Screenshot: the two time-bound segments]

Step 4: Create two calculated metrics

The first calculated metric will give us our target number. We’ll call it “Visits Target: Month to Date 10% Increase YoY,” and the formula is simply the visits metric nested inside the “Month to Date (Last Year)” segment, with a static-number multiplier. In this case our multiplier is 1.1, since we want a 10% increase.

[Screenshot: the target calculated metric with its static-number multiplier]

Note that the “Static Number” option is hiding under the “+Add” option in the top right. If you look for it in the left-hand component rail, you won’t find it.

The second calculated metric will show the comparison to goal. We’ll call it “Visits Target: Over / Under YoY,” and it contains a simple difference calculation between this year’s actuals and last year’s actuals × 1.1:

[Screenshot: the over/under calculated metric]
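Here is the same arithmetic as a quick Python sketch, with invented visit counts:

```python
# Invented numbers: visits inside each time-bound segment.
visits_mtd_last_year = 10_000   # "Month to Date (Last Year)" segment
visits_mtd_this_year = 10_800   # "Month to Date (This Year)" segment

# Metric 1: "Visits Target: Month to Date 10% Increase YoY"
visits_target = visits_mtd_last_year * 1.1            # 11,000

# Metric 2: "Visits Target: Over / Under YoY"
over_under = visits_mtd_this_year - visits_target     # -200, i.e. short of target

print(f"Target: {visits_target:,.0f}   Over/Under: {over_under:+,.0f}")
```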

Step 5: Pull the new metrics into a Freeform Table

Since you’re comparing this year and last year, be sure to use the “Day of Month” dimension rather than “Date,” which includes the year and will cause this year’s column to show zeroes. For the over/under metric, consider using conditional formatting to highlight where the variances were greatest.

[Screenshot: the Freeform Table with the new target metrics]
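A toy Python illustration of why the year-free key matters; the daily counts are invented:

```python
# Daily visits keyed by full date: the years never line up across the two
# dictionaries, so joining on "Date" produces zeroes for this year's column.
this_year = {"2018-03-01": 350, "2018-03-02": 420}
last_year = {"2017-03-01": 300, "2017-03-02": 390}

# Keyed by day of month instead, the rows align and the over/under can be
# computed row by row against the 10% target.
for date_key, last_year_visits in sorted(last_year.items()):
    day = date_key[-2:]
    actual = this_year.get(f"2018-03-{day}", 0)
    target = last_year_visits * 1.1
    print(f"day {day}: actual {actual}, target {target:.0f}, over/under {actual - target:+.0f}")
```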

Step 6: Create visualizations

Let Analysis Workspace do what it does best… visualize!

[Screenshot: the finished Workspace visualizations]

[Screenshot: bar graph comparing visits across time periods]

Looks like Adobe had a similar idea. In the “Too Hot for Main Stage” session at Summit, where they showcase potential new features and let the crowd vote on their favorites, one of the ideas was a “Forecasting the Future” feature. It would natively do what we’ve just set up here, plus offer an “analyze” button to surface the top factors contributing to whether you’ll actually reach your target.

Fingers crossed they actually build that out, but in the meantime you can do it yourself!

Takeaways

  • This is a scalable way to set your targets; they’ll continually update themselves
  • You can easily adjust this for different time frames or metrics

Conclusion

In the end, there can be only one. And sadly, I was not it this year. While I did not win the coveted title of Adobe Analytics Rockstar, I was very impressed with the tips shared by my competitors, and so proud to represent Blast!

[Photo: Halee Kotara with her prize after the Adobe Analytics Rockstar competition]

 
