Putting Together a Product Data Team + Automating your Data Privacy + The Pitfalls of Being Data-Driven + More (PAN #22)
Edition #22 - November 4, 2019
Good morning, product owners!
Ahhh, it is good to be back! Not that I can complain, as the two weeks spent in Vietnam were simply amazing. Friendly people + good food + too much to do and see + crazy conversion rate from dollars to dongs = pure fun.
Now, I know this will sound like bragging, but as you read this, I'll be spending the week in London, catching up with clients and partners. Looking forward to the conversations, and hopefully I will have some cool nuggets to share in the next edition of the newsletter.
With that, on with the 22nd edition of the Product Analytics newsletter!
Olivier @olivierdupuis
Top Pick
Chances are that if you're reading this, you lead or are part of a product team and understand the value of analytics in guiding the development of your product. The two stories below are complementary: the first provides an overview of why analytics matters for a product team, and the second goes deep into how a data team could be structured and operated to generate the insights that guide your product's development.
Fostering better collaboration within product teams: An interview with John Cutler
If you don't know John Cutler, I would suggest you follow him on Twitter, as he often publishes high-quality content on product management, teams, etc. And as a product evangelist for Amplitude, he understands the value of data in managing products.
An interesting quote from that interview:
"The amount of insight you can derive from quantitative data is finite, and qualitative data in itself can be too broad. What I've noticed in teams that are using quantitative data really effectively is that they are a lot more laser-focused in how they conduct qualitative research."
Data Team Handbook
Now, I acknowledge that not all product owners have access to a product analytics team, but for those who do, or who are considering building one, this resource from GitLab is a must-read.
How to organise your data team, what a typical data analysis process looks like, how they structured their data stack, other resources to help you out, etc. - it's all in there. This handbook reflects GitLab's dedication to building the most efficient and valuable data team out there.
Also happy to report that our modest Product Analytics newsletter is listed in their selection of newsletters alongside such giants as SF Data, Normcore Tech, Data Science Roundup, etc.
Product News
The Forrester Wave™: Enterprise BI Platforms (Vendor-Managed), Q3 2019
First off, I found this report through Looker's own tweet about it. Not sure how that works, but you can get the report freely through Looker's page, whereas it costs US$2,495 to purchase from Forrester's website 🤷‍♂️
Secondly, I tend to be a bit wary of such reports. Their evaluation methodology is always a bit obscure, and you wonder how much objectivity there is to it. That said, it's still a good way to get a decent overview of the landscape and maybe guide your own exploration of the ideal BI solution for you.
Four things pop up for me:
- How Microsoft's Power BI is clearly leading in that report
- How Tableau is losing its edge
- The ever-increasing number of solutions, including many I wasn't even aware of
- How Looker's position, and Forrester's analysis of it, still doesn't solve the mystery of why it's so popular amongst BI practitioners
Segment and the Privacy Portal
Segment hosted a webinar to present their approach to managing your users' privacy through their newly launched Privacy Portal. This might seem GDPR-oriented, but I think we should all keep a close eye on this subject, not only to be compliant with regulations, but to empower your users.
How Segment manages privacy is, as always, pretty smooth. It starts by assessing the risk associated with each field. You can then enforce controls on that data, for example preventing certain fields from ever being collected from a specific source.
There are also inventories that answer the following key questions: which data fields am I capturing, where do they come from, and where do I send them? This is your single source of truth regarding all data coming in and out of your stack.
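The idea behind this kind of field-level control can be sketched in a few lines of Python. To be clear, this is a hypothetical illustration of the concept, not Segment's actual API; the field names and risk levels are invented:

```python
# Hypothetical sketch of field-level privacy controls: classify each known
# field by risk (the "inventory"), then scrub events so that risky fields
# are never collected. Not Segment's actual API; names are invented.

# Risk classification for every field we know about.
FIELD_RISK = {
    "user_id": "low",
    "plan": "low",
    "email": "high",
    "credit_card": "high",
}

# Policy: fields classified as high risk must never be collected.
BLOCKED_RISK_LEVELS = {"high"}

def scrub_event(event: dict) -> dict:
    """Drop any field whose risk level says it should never be collected.

    Unknown fields default to high risk, so a new field must be classified
    in the inventory before it can flow through.
    """
    return {
        key: value
        for key, value in event.items()
        if FIELD_RISK.get(key, "high") not in BLOCKED_RISK_LEVELS
    }

raw_event = {"user_id": "42", "plan": "pro", "email": "jane@example.com"}
print(scrub_event(raw_event))  # {'user_id': '42', 'plan': 'pro'}
```

Defaulting unknown fields to high risk is the conservative choice: it forces the inventory to stay in sync with what your sources actually send.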
Strategy
Who's Driving This Thing? The Pitfalls of Being "Data-driven"
"We should make decisions informed by data. But should we be driven by data? Or by our strategy?"
I like that emphasis on being data-informed rather than data-driven. As we've often discussed before, especially in our guide [https://odignite.wpengine.com/ultimate-guide-to-product-analytics/], data should inform our strategy and tell us how well we are achieving our goals. We shouldn't dig into data with the sole intention of finding some gold nuggets that will blow our minds.
"If you start chasing insights instead of answering questions, you could end up answering a bunch of questions that don't matter at all."
What weâre missing in product analytics
That's an interesting high-level piece for anyone considering taking advantage of analytics to grow their product. It's about the different routes you can take: prescriptive out-of-the-box solutions; custom solutions provided by external companies such as Lantrns Analytics [https://odignite.wpengine.com/]; or a fully internal custom solution.
"It mainly comes down to 'go prescriptive' or 'go custom' and that's a rather hard decision to make. And it turns out that eventually, you end up going custom to a certain extent and I've come to accept that as a fact."
I tend to agree with that statement. It's normal for product owners to start their analytics journey with Google Analytics, Amplitude, Mixpanel, etc., but as you get more sophisticated, those solutions just don't cut it anymore.
Best Practices
How Animoto uses event tracking data to understand and optimize the user journey
Even if you haven't used Snowplow, aren't interested in it, or aren't even aware of it, this story is worth the read. It's essentially the story of an e-commerce company using analytics that developed four event-based models to better understand how users behave on their platform.
Best practices for data modeling
If you've set up your data architecture correctly, you are now capturing important data from your product, and you will probably want to analyse that data yourself. Welcome to the world of data modeling.
Of course, you could just plug BI software such as Tableau, Power BI, or Mode directly on top of your raw data sources and model your data through their interface. But a better approach is to have a data modeling layer in your architecture, which generates entity tables (facts and dimensions) and takes care of data transformation once and for all. This facilitates analysis by providing a clean and consistent data layer with a single set of transformation rules.
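Conceptually, that modeling layer can be as simple as the following Python sketch, which splits raw order events into a user dimension and an orders fact table. All table and field names here are invented for illustration; in practice this transformation would live in SQL or a tool like dbt:

```python
# Minimal sketch of a data modeling layer: turn raw order events into a
# dimension table (users) and a fact table (orders) once, so every
# downstream analysis reads the same clean entities. Names are invented.

raw_events = [
    {"order_id": 1, "user_id": "u1", "country": "CA", "amount_cents": 1999},
    {"order_id": 2, "user_id": "u2", "country": "US", "amount_cents": 4999},
    {"order_id": 3, "user_id": "u1", "country": "CA", "amount_cents": 999},
]

# Dimension: one row per user, holding descriptive attributes.
dim_users = {}
for event in raw_events:
    dim_users.setdefault(event["user_id"], {"country": event["country"]})

# Fact: one row per order, with a measure and a foreign key into the
# dimension. The cents-to-dollars rule is applied here, once and for all.
fct_orders = [
    {
        "order_id": event["order_id"],
        "user_id": event["user_id"],
        "amount_usd": event["amount_cents"] / 100,
    }
    for event in raw_events
]
```

The payoff is that every analysis joins `fct_orders` to `dim_users` instead of re-deriving dollar amounts or user attributes from raw events with its own, possibly divergent, rules.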
Data modeling is not a complex task, but there are best practices to it. It's not a field that has changed dramatically, although the latest technological advancements have made some of the well-anchored "Kimball rules" less strict than they used to be. That said, it's good to familiarize yourself with those best practices, as you want your analysis to rely on the best data there is.