Every 2 weeks, we share a selection of articles on how you can leverage data to strategically grow your digital product.

Edition #34 – June 15, 2020
Originally sent via Mailchimp

Good morning product analytics friends 👋

It’s summer and I want to shake this newsletter up a little for the next few months. That essentially means I’ll be sending new editions irregularly and publishing only 3 stories per edition 🍹. That might seem lazy (and you may be on to something), but I like to think it’s actually leading to something I’m working on that should be available in the fall.

With that, on with the 34th edition of the Product Analytics newsletter!

Olivier

@olivierdupuis


Introducing The RA Warehouse Framework

RittmanAnalytics.com
by @markrittman

Adopting a modular approach to analytics has many advantages. The most obvious is that you can swap tools in and out rather than being locked into a corporate walled garden. Another is extensibility: a community of contributors can build on top of it.

To make all those modules work together, you need a common language: a shared set of principles, design patterns and conventions. That foundational layer lets creative modules be built on top of it and adopted by others who share the same foundations. This is where Rittman Analytics’ data warehouse framework fits in.

The core idea is that you should be able to design a data warehouse quickly and efficiently by staging multiple sources and integrating them into common-purpose warehouses (financials, product, marketing, etc.). Since we mostly rely on the same SaaS-based sources, you could have recipes to integrate them and “extract” dimensional models out of them. For example, you could load Facebook Ads and Google Ads sources, integrate them, and build a marketing warehouse out of them.
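To make that concrete, here’s a minimal sketch of what such a recipe could look like as a dbt model. The source and column names (stg_facebook_ads, stg_google_ads, spend, clicks) are my own assumptions, not the framework’s actual interface:

```sql
-- Hypothetical integration model: conform two ad sources into one
-- common-format table that a marketing warehouse can build on.
with facebook as (

    select
        'facebook_ads' as source,
        campaign_name,
        date           as spend_date,
        spend,
        clicks
    from {{ ref('stg_facebook_ads') }}

),

google as (

    select
        'google_ads' as source,
        campaign_name,
        date          as spend_date,
        spend,
        clicks
    from {{ ref('stg_google_ads') }}

)

select * from facebook
union all
select * from google
```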

This project is led by Mark Rittman, who is building on his lengthy experience in the analytics space to provide a framework that codifies, right from the start, the design principles, modeling patterns and quality controls that go into a modern data warehouse. It’s essentially a cookbook of recipes learned from a rich set of experiences.

I’ve had the chance to work with this framework over the past 2 weeks and I really like where this is going. The core idea and its execution are very solid imo, and I’m excited to see how it will evolve and benefit all the projects that will be forked from it in the future. You should definitely read its associated blog post to dig into the technical details of its implementation.


Augmented Analytics Is The Future of Analytics

Gartner.com
by Gartner

According to Gartner, “by 2021, 50% of analytical queries will be generated via search, NLP or voice, or will be automatically generated.” Their point is that organizations will rely less on data scientists to dig into the data and produce insights.

I think there’s a kernel of truth in that, but to me that statement takes a shortcut: it assumes that data is already available in a clean, well-structured format, ready to be analyzed, or that you could just use AI to extract insights directly out of a data lake. I don’t think that’s the case. Organizations have multiple sources of data that don’t necessarily integrate well together, and there’s still some engineering that needs to happen in those early stages to get to structured dimensional models that can then be queried intelligently.

So AI could very well drive the augmentation of data, but it could also improve the transformation of raw data into usable data. It might allow us to interface more easily with data, improve its quality, and enrich it by adding new dimensions and layers of meaning.

For example, there was a Software Engineering podcast episode with the creator of HoloClean, which “leverages available quality rules, value correlations, reference data, and multiple other signals to build a probabilistic model that accurately captures the data generation process, and uses the model in a variety of data curation tasks.” Essentially, it uses machine learning to improve the quality of your data.
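That’s not HoloClean’s actual API, but to make the “value correlations” idea concrete, here’s a toy, deterministic stand-in in plain SQL (the customers table and its columns are hypothetical, and mode() is a Postgres aggregate): a missing value gets filled in from the most common value observed in correlated rows.

```sql
-- Toy illustration of repairing data from value correlations (not HoloClean):
-- fill missing cities with the most frequent city observed for the same zip code.
update customers c
set city = r.most_common_city
from (
    select
        zip,
        mode() within group (order by city) as most_common_city
    from customers
    where city is not null
    group by zip
) r
where c.zip = r.zip
  and c.city is null;
```

HoloClean goes much further, weighing many such signals probabilistically rather than taking a simple majority vote, but the spirit is the same: use the structure already in your data to repair your data.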

Or take this blog post on in-place machine learning in Snowflake. I have not experimented with this, but the idea is that you could start applying predictive models directly within your data warehouse, without going through an external service. That’s a promising avenue for adding an AI-driven improvement and augmentation chain to your data warehouse.
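As a rough illustration of what “in place” can mean: once a model’s coefficients are learned (here, offline), scoring can reduce to a SQL expression that runs where the data already lives. The table, features and coefficients below are all assumptions for the sake of the example:

```sql
-- Hypothetical in-warehouse scoring of a logistic regression model:
-- coefficients were learned offline, predictions are computed in plain SQL.
select
    user_id,
    1 / (1 + exp(-(
        -2.3                            -- intercept (assumed)
        + 0.8 * sessions_last_30_days   -- assumed feature weights
        - 1.5 * days_since_last_login
    ))) as churn_probability
from analytics.user_features;
```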


Hey Kid, Take A Walk Into Qualitative Land

Sharpen.Page
by @pascallaliberte

I once shared a blog post I wrote about extracting funnel paths in your data warehouse using dbt. The idea is that you should be able to associate multiple paths with a single funnel and have them directly in your data warehouse, to be analysed in the tool of your choice.
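For a sense of what that looks like in practice, here’s a minimal dbt-style sketch (not the actual model from that post). It assumes an events table with user_id, event_name and occurred_at columns, and Snowflake’s listagg to build the path string:

```sql
-- Hypothetical dbt model: one row per user, with the ordered sequence
-- of funnel events they actually performed.
with ordered_events as (

    select
        user_id,
        event_name,
        row_number() over (
            partition by user_id
            order by occurred_at
        ) as step_index
    from {{ ref('events') }}
    where event_name in ('viewed_pricing', 'started_trial', 'purchased')

)

select
    user_id,
    listagg(event_name, ' > ') within group (order by step_index) as funnel_path
from ordered_events
group by user_id
```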

But going beyond single-path funnels requires good knowledge of your users and the multiple paths they might take. Funnels and paths are an abstraction of your users’ behaviours, and relying only on quantitative data to map and analyse them can miss the nuances. That’s why we think taking a detour into qualitative data is as important as capturing and analysing high-quality behavioural data about your users.

“When Funnels only Explain a Minority of Purchase Scenarios” is an article by Pascal Laliberté where he takes you on that journey into qualitative land to give context to, and refine, your understanding of users’ behaviours. Pascal structures his approach with the Jobs To Be Done framework, which essentially associates behaviours with a job a customer wants to get done; they hire a product to help them do it. So a person doesn’t buy a hammer for the sake of owning one, but to build a treehouse.

Purchase stories will help you see how each of the steps in your funnel was answering a different job-to-be-done, which might explain why your funnel didn’t really flow.

What that means is that each step of a funnel is hired for a job. If those jobs are distinct and not linked to one another, you’re missing the clear narrative that leads down the funnel’s path. You might want to optimize the whole funnel, but you’re really working with a multitude of distinct jobs that don’t flow from one to another.

Armed with that knowledge, the qualitative journey leads you back to refining existing paths and defining new ones in your funnel. That loop is what gets you to relevant and insightful funnels in your analytics.
