What Product Managers Can Learn From Digital Analytics

The first phase of enterprise SaaS was taking all the stuff that used to run on-premises, putting it in the cloud and changing up the billing/business model. This has unlocked a tremendous amount of flexibility, innovation and improvement in enterprise software, ranging from design to security to cost-effectiveness. Enterprise SaaS product managers have long used product data and analytics to improve those products, but it’s often been surprisingly difficult to do so. I should know. Getting usage analytics on pretty much all of the cloud products I’ve ever managed has been way harder than it should be.

But that’s changing, and there’s a whole new crop of PMs out there using new tools that make it easier than ever both to get product analytics data and to put that user feedback to direct use in honing their products. In fact, I increasingly see similarities between where product managers are today and where marketers were about ten years ago. Marketing’s adoption of digital analytics tools holds a ton of important lessons for today’s enterprise SaaS product managers, particularly in how to consume those tools, properly derive insight from them, and make their outputs actionable.

For this blog post, I decided to call in outside expertise from someone who participated in that evolution in marketing firsthand. I asked Nancy Koons, who is something of a celebrity in the digital analytics world, to talk about some of the lessons she’s learned applying analytics to marketing, lessons I find highly transferable to product. We traded emails on this for a while, and I’ve distilled our exchange in the points below.

Nancy Koons started her career in digital analytics over a decade ago. In her first client-side analyst role she guided a growing, dynamic company through the implementation of Adobe Analytics (then known as Omniture), working with everyone from the C-suite to the Marketing, IT, Revenue Management, CRM and Social teams to understand customer behavior and site performance and to recommend ways to improve marketing spend. Nancy now works with companies of all sizes and in all phases of analytics adoption, helping them grow their teams and their analytics competencies and put their digital data to use.

For my part, I’ve worked on digital and marketing analytics platforms for a long while now, and have gotten to witness the growing pains of the marketing function’s adoption of (and resistance to) analytics. Adopting digital analytics tools, like Google Analytics or Adobe Analytics or Coremetrics (😄), is pretty much table stakes for any serious marketer today. We forget that it was not always thus. Marketing went through a whole period where analytical marketers had to constantly prove and defend the ROI of such tools to skeptical old-timers. Obviously, as ecommerce exploded and the internet became, you know, a thing, that conversation ended fairly quickly, and companies that were behind the curve paid a dear price.

So brands and companies bought digital analytics tools, but then had to learn to use and adopt them, which changed an awful lot about how marketing works. Anyone can pop open a Google Analytics dashboard – but so what? What do you actually do with that? Companies of all sorts still wrestle with how to best use their digital analytics tools, and this has given rise to a sprawling and intensely sought-after industry of digital analytics consultants (like Nancy) who help companies make sense of their marketing data.

Getting insight is harder than it looks

The hard truth is that getting insight on actual user behavior (whether on a website or in a SaaS product) is harder than many people assume. Lots of PMs and developers have prior experience doing basic analysis with existing datasets – but not necessarily in setting up a tracking strategy and actually collecting accurate data. There’s often this assumption that data collection/tracking is simply a matter of sticking some code on your web page/app. That will typically get you some data, but it’s close to meaningless without first being clear about what you’re trying to measure and, importantly, why. To make sure your data is comprehensive, not to mention valid, you have to be very intentional about what it is you mean to track.
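Here’s a minimal sketch of what “being intentional” can look like in code. The track() helper and event names are hypothetical, not any particular vendor’s API; the point is that every event is something you deliberately chose to measure, carrying the context you’ll need at analysis time.

```typescript
// Hypothetical tracking helper -- illustrative, not a real vendor SDK.
// The key idea: a closed set of deliberately chosen events, not "tag everything."
type EventName = "trial_started" | "report_exported" | "invite_sent";

interface TrackedEvent {
  name: EventName;       // only events we explicitly decided to measure
  userId: string;
  accountId: string;     // lets us segment by company later, not just average
  properties?: Record<string, string | number>;
}

function track(event: TrackedEvent): void {
  // A real implementation would send this to an analytics endpoint;
  // logging keeps the sketch self-contained and runnable.
  console.log(JSON.stringify({ ...event, timestamp: Date.now() }));
}

// Instrument the action you actually care about, with the context you'll need:
track({
  name: "report_exported",
  userId: "u_123",
  accountId: "acct_456",
  properties: { format: "csv", rowCount: 1200 },
});
```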

Nancy: One of the big things I’ve learned is that educating everyone else around me at my company was a big part of my role, and no one (including myself) really anticipated that. The biggest things I had to continually preach: implementation is not a “set it and you’re done” exercise; any change introduced to a website or app can disrupt or break tracking (it’s so much more fragile than anyone ever thinks); QA has to happen with every release, simply to preserve the tracking and data collection already in place; and we had to focus on what was truly important to measure and make sure our data collection was rock solid.

Deciding what to measure isn’t obvious. Are you just measuring user pathways? Specific KPIs that represent value either to you (the company) or to your users on your application or website? How have you approached that problem?

Nancy: On websites and apps there’s usually a bunch of stuff – and [the analyst’s job is] mainly sorting through which things are the most important, and which things are less important (even though that one marketing coordinator really wants that one number). We in the industry still debate whether to try to track ‘everything’ on a site, or just the things someone outside of the analytics team has identified as being important. The upside to tracking everything is having the data already collected when someone asks a question. The downside is it’s very time intensive and the team can spend way too much time setting up data collection on things no one will ever look at or care about. So it’s about finding that balance – which can vary from team to team and company to company.

If a client cannot articulate their current KPIs (both new and large orgs sometimes struggle with this), I focus on the following questions to tease out priority: If you had to pick 2-3 actions you want customers to take on this page or site, what are they? How do you plan to use the information we are planning to collect? If a marketer can’t explain how the data collected will lead to action, it probably isn’t critical.

When it comes to interpreting the data in an analytics tool, an early lesson I learned: If I couldn’t explain what a metric meant, or what it represented, my stakeholders lost all confidence in the data. So I had to really figure out what the data in the tool did and did not represent when I shared analysis and talked through reports.
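Nancy’s “will this lead to action” test translates directly to product work. Here’s a hypothetical sketch of a tracking plan expressed as a data structure, where every metric has to name the customer action it measures and the decision it will inform; the field names and entries are mine, purely for illustration.

```typescript
// Hypothetical tracking-plan structure; field names are illustrative.
// If "useFor" is hard to fill in, the metric probably isn't critical.
interface TrackingPlanEntry {
  metric: string;          // what we collect
  customerAction: string;  // one of the 2-3 actions we want customers to take
  useFor: string;          // the decision this data will inform
}

const trackingPlan: TrackingPlanEntry[] = [
  {
    metric: "checkout_completed",
    customerAction: "Complete a purchase",
    useFor: "Prioritize checkout friction fixes",
  },
  {
    metric: "newsletter_signup",
    customerAction: "Subscribe to the newsletter",
    useFor: "Size the audience for lifecycle campaigns",
  },
];

// Anything proposed for tracking that can't be written as a row here is a
// candidate for the "nice to have, probably never looked at" pile.
console.log(trackingPlan.length, "metrics with a stated use");
```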

Conducting proper analysis

I see a big issue with how we analyze behavioral data. Without clear metric definitions, segmentation, and breaking out discrete groups (like the different companies using our product, or user personas/functions), all you have are averages, which aren’t very useful for drawing insights about what users really want. This is especially true in product analytics, where user personas can also differ sharply across industries. Specifically, a product manager analyzing user behavior needs to be very careful to collect a statistically significant sample of behavior from a representative user group before drawing any conclusions. I feel like a lot of casual users of analytics tools are quick to over-infer or generalize from non-representative data.
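To make “statistically significant” concrete, here’s a minimal two-proportion z-test, one standard way to check whether a difference between two segments is bigger than random noise. The segment sizes and adoption counts below are made up for illustration.

```typescript
// Minimal two-proportion z-test: is the difference between two segments'
// conversion (or adoption) rates larger than we'd expect from chance?
function twoProportionZ(
  successesA: number, totalA: number,
  successesB: number, totalB: number,
): number {
  const pA = successesA / totalA;
  const pB = successesB / totalB;
  const pooled = (successesA + successesB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pA - pB) / se;
}

// Example: 120/1000 vs. 90/1000 users adopting a feature in two segments.
const z = twoProportionZ(120, 1000, 90, 1000);
// |z| > 1.96 roughly corresponds to p < 0.05 (two-sided).
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "likely a real difference" : "could be noise");
```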

Nancy: This is also about finding the proper comparison: In Marketing, is it most useful to compare user behavior Year over Year, or Week over Week? (The answer is, it depends on the business.) Do you need to think about seasonality in terms of usage of a site/app/product by day of week, or month of year? In Marketing, you are focused on the end customer and your business, and on finding the right lens. For product analytics, I’d imagine it would be somewhat similar – e.g. don’t make decisions about product usage based on the Sunday morning average when everyone uses the product M-F from 8-5. How finely you can drill into your data always depends on how much data you have. I frequently encounter Marketers and Product Managers who make an update to a portion of the site, and they want to know THE NEXT DAY if they’ve made an impact. Guiding them through the typical number of hours, days or weeks of data needed to determine that is a constant exercise.
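To put some arithmetic behind “hours, days or weeks”: a standard power-analysis approximation gives a rough sample size needed to detect a given lift. The baseline rate and traffic numbers below are invented, but the shape of the answer is the point.

```typescript
// Rough per-group sample size to detect a relative lift in a conversion rate
// (standard approximation, two-sided alpha = 0.05, power = 0.8).
function sampleSizePerGroup(baselineRate: number, relativeLift: number): number {
  const zAlpha = 1.96; // two-sided alpha = 0.05
  const zBeta = 0.84;  // power = 0.8
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p1 - p2) ** 2);
}

// E.g., a 3% baseline conversion rate, hoping to detect a 10% relative lift:
const perGroup = sampleSizePerGroup(0.03, 0.10);
console.log(perGroup, "users per group"); // ~53,000
// At ~1,000 eligible users per group per day, that's nearly two months of
// data before you can call the result -- not THE NEXT DAY.
```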

This gets into another big question – who does the analyzing, and how does a Product team incorporate analytic usage data like this into product development cycles? I think a lot of Product teams could learn from the best digital marketing teams on this. How do high-performance marketing teams fold analytics feedback into their regular workflow?

Nancy: This always depends on the mix and makeup of the teams, but what I did in my role at Vail Resorts was train and educate each team on which data they should use, to get them comfortable and adept at (correctly) analyzing the data relevant to their role. So, the e-commerce team understood how to pull conversion and revenue reports, as well as what they could and couldn’t correlate them to in our dataset. Same thing for the CRM, Social and Media teams – the analytics team guided them in understanding how to pull and analyze the information they needed on a day-to-day basis. For more in-depth analysis, the analytics team would then dive in and try to assess and answer questions. Whether it was to help retain some objectivity, to help an end user who lacked analysis skills, or to verify and validate that the marketer had pulled the right information – I see analytics specialists continuing to play the role of Data Subject Matter Expert. While there are many vendors out there who claim their tool will give “anyone” the ability to analyze data, it’s been my experience in digital analytics that it’s still tough for people to understand the nuances of digital data collection well enough when their day-to-day job involves doing something entirely different.

Looking for patterns that aren’t there

You bring up a really important point there about objectivity. There’s that famous joke about how a bunch of marketers, each responsible for a different channel, roll into a meeting, each claims X dollars of sales from their own channel, and the sum total across them all is 10x the actual sales that came in. How can product managers, who are analyzing behavioral data from their own products, guard against “seeing what they want to see,” so to speak?

Nancy: Digital analysts run into another, similar issue when looking at behavioral data. Example: digital analytics tools attempt to measure how much time visitors spend on a website or web page. (I say “attempt” because there are inherent challenges in accurately measuring this and it’s a flawed metric, but that’s a topic for another post.) There is still the question: even if you had a perfect way to measure time on site, is a longer time on site a good or bad thing? A content marketer may assume that more time on site = more content consumed. An e-commerce marketer might be concerned that the purchase process takes too long. This is where I would guide a content marketer to focus instead on the number of articles or videos viewed – a better metric than simply the time a user has their browser open.

For the e-commerce manager who is worried there are too many steps, or too much friction, in the checkout process, I would recommend building out a funnel to understand where users drop off, and combining that data with a heuristic review of the process on the site to identify opportunities for testing or improvements. In Product analytics, I imagine it could be difficult to understand whether someone navigating through certain paths or processes is a good or bad thing. What if the only way to use the product is to jump through a bunch of unnecessary hoops? You see customers trained up on how to use your product, and able to complete the process, but it may be that they are really frustrated – they just can’t tell you that through their actions, and you won’t see it just by watching what they do inside the tool.
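To make the funnel idea concrete, here’s a minimal drop-off calculation; the step names and counts are invented for illustration.

```typescript
// Minimal funnel report: given the number of users reaching each step,
// print step-to-step conversion so you can see where people bail out.
interface FunnelStep { name: string; users: number; }

function funnelReport(steps: FunnelStep[]): void {
  for (let i = 0; i < steps.length; i++) {
    const suffix = i === 0
      ? ""
      : ` (${((steps[i].users / steps[i - 1].users) * 100).toFixed(1)}% of previous step)`;
    console.log(`${steps[i].name}: ${steps[i].users} users${suffix}`);
  }
}

funnelReport([
  { name: "Viewed cart", users: 5000 },
  { name: "Started checkout", users: 2900 },
  { name: "Entered payment", users: 2400 },
  { name: "Order placed", users: 1800 },
]);
// The biggest step-to-step drop (here, cart -> checkout at 58%) is where the
// heuristic review and testing effort should start.
```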

What advice would you give to product managers who are starting to use analytics as a source of customer feedback? What traps or pitfalls from digital marketing’s adoption of these tools can we avoid?

Nancy: What we learned in the early days of digital analytics is that clickstream data about users’ actions on a website is often not enough. Digital analytics data can tell you WHAT happened, but cannot necessarily tell you WHY. I like using website surveys and usability studies to help supplement digital analytics data. Site surveys work very well in digital analytics because you can quickly build and launch a survey to collect a few pieces of information, and there are lots of very nimble options out there that an analyst can run with on their own. Usability studies are often conducted by a UX or Online Experience team, and can be a great way for an analyst to listen in as people navigate a web product and provide feedback. For Product analytics, I imagine user surveys, customer surveys, or customer forums would be a great place to pose questions and collect the qualitative feedback you cannot glean from your quantitative data sources.

There really is a ton more to talk about on this topic, but this post is getting long. Stepping back: product managers today have an awful lot to learn from how the digital marketing field adapted to the rise of digital analytics tools. It wasn’t just about learning how to interpret feedback. It was also about learning how to quickly adapt content and online experiences to match audience reactions, and to rigorously interrogate what was working versus what wasn’t. Most importantly, many companies had to learn the very difficult lesson of changing how marketing itself was done, using analytics as a guide and evolving their organizations, roles and responsibilities accordingly. Those of us in product should learn from their experience.

I want to give a shout out to the good folks at Pendo here for talking through some of these issues with me as well. In addition to being headquartered in Raleigh, which I love, their CEO, Todd Olson, generously agreed to write the foreword in our upcoming book on product management in enterprise software, which is due out in just a few weeks! Pendo is doing arguably the coolest stuff out there right now in product analytics – go check them out.

 
