Edition #28 – February 17, 2020
Originally sent via Mailchimp
Good morning product friends 👋
In our last edition, we talked about the platform of independents, a “movement” that promotes modularity against the sector’s trend toward consolidation.
Well, it seems this week is all about the other side of that coin. It’s hard not to talk about the M&As, strategic partnerships and other forms of consolidation happening in the data/analytics/BI space. Between Salesforce’s strategic partnership with Snowflake, the completion of Looker’s integration into Google Cloud, and more, it seems like the modular approach is threatened from all sides.
Each week brings its fair share of news/hints about such moves, but at the same time there is a very vibrant and healthy ecosystem of companies and data analysts/engineers who believe in modularity and openness. Let’s hope that coexistence won’t be an issue, with the benefits of having major players (Google, Salesforce, Amazon, etc.) invest in analytics while leaving room for smaller players that innovate and forge an exciting future.
And with that, on with the 28th edition of the Product Analytics newsletter!
What has been my highlight?
Snowflake announced that it raised a bunch of money and pushed its valuation to $12.4B. Why? Well, in the words of Frank Slootman, CEO of Snowflake, the idea was not necessarily to raise more capital, but to establish a partnership with Salesforce.
Now, in itself, this is another big step forward for Snowflake. But the angle of interest is how Salesforce is making another move in that space (after acquiring Tableau for $15.7B back in August 2019). It takes the form of an investment by Salesforce, but also a strategic partnership between the two companies. Salesforce co-led the funding round, and that seems to be only the first manifestation of the partnership. Unfortunately, we’ll only get more details during the Snowflake Summit in June 2020! But Slootman does give a first glimpse of what it means…
At a high level, the relationship is really about allowing Salesforce data to be easily accessed inside Snowflake. Not that it’s impossible to do that today, because there are lots of tools that will help you do that. But this relationship is about making that seamless and frictionless, which we find is really important.
The CNBC interview with Slootman is also revealing as to why such mergers and acquisitions (and partnerships) are taking place at this moment. It’s all about the cloud revolutionizing all computing infrastructures, and key players want to come out on top of that wave.
Growing your product with the help of data.
As defined by Baremetrics, the “Quick Ratio of a SaaS company is the measurement of its growth efficiency. How reliably can a company grow revenue given its current churn rate?”
We’ve talked about Net Dollar Retention before, and there’s an infinite number of metrics (Top 5 SaaS Metrics VCs Look At for Series A/B/C) you can obsess over to evaluate the health of your business. But the quick ratio has the benefit of being simple to calculate and easy to explain and understand, while putting a measure on an important aspect of your growth: the ratio of new recurring revenue to lost recurring revenue.
Of course it’s not a perfect metric (it doesn’t account for cost of acquisition, for example), but it’s another one you can choose to add to the selected few that guide your growth. It helps assess the overall health of your business (growth vs. churn), but shouldn’t be an operating metric, as the author argues: “In the case of a business operator, however, it’s often more useful to break the ratio down into its component parts.”
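Since the quick ratio is just new recurring revenue over lost recurring revenue, the calculation fits in a few lines. Here’s a minimal sketch in Python, using the common breakdown into new, expansion, churned and contraction MRR (the function name and sample figures are illustrative):

```python
def quick_ratio(new_mrr, expansion_mrr, churned_mrr, contraction_mrr):
    """SaaS Quick Ratio: recurring revenue gained over recurring revenue lost."""
    lost = churned_mrr + contraction_mrr
    if lost == 0:
        return float("inf")  # no revenue lost this period
    return (new_mrr + expansion_mrr) / lost

# A month with $10k new, $2k expansion, $3k churned and $1k contraction MRR:
print(quick_ratio(10_000, 2_000, 3_000, 1_000))  # 3.0
```

A ratio above 4 is often cited as a sign of efficient growth; below 1, you are losing recurring revenue faster than you add it.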
Factory operations to transform data into analytics.
What a great interview with James Campbell, one of the core contributors to the Great Expectations framework. I have to admit that I’m still familiarizing myself with this project. In fact, I’m still trying to wrap my head around how GE tests can complement a dbt testing suite (I’m actually gonna explore this for a client next week, so hopefully that should be clearer soon).
Anyways, there’s a bunch of stuff I noted down while listening to this podcast that I thought might be of interest. They’re not fully formed ideas, but hopefully they’ll give you enough incentive to tune in and listen for yourself.
- Objective – Always know what to expect of your data
- Use as conversation tool to ask business what they expect the output of data to be
- dbt and GE – dbt for integrity of data and GE for business validations?
- Increase coverage in extension to other tools
- Could be used as alerting system with slack integration
- Next up, community driven gallery of solutions
Want a more visual introduction to Great Expectations? Have a look at this Strata 2018 conference presentation – Strata 2018: Pipeline Testing with Great Expectations.
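The core idea behind Great Expectations (declarative “expectations” about your data that double as tests and documentation) can be sketched in plain Python. To be clear, this is an illustration of the concept, not the Great Expectations API; the function name mimics the library’s naming style and the sample rows are invented:

```python
def expect_column_values_to_be_between(rows, column, min_value, max_value):
    """Check that every value of `column` falls within [min_value, max_value],
    returning a result dict in the spirit of an expectation suite."""
    unexpected = [row[column] for row in rows
                  if not (min_value <= row[column] <= max_value)]
    return {"success": not unexpected, "unexpected_values": unexpected}

# Illustrative data: one order has a suspicious negative amount.
orders = [{"amount": 25.0}, {"amount": 90.0}, {"amount": -5.0}]
result = expect_column_values_to_be_between(orders, "amount", 0, 10_000)
print(result)  # {'success': False, 'unexpected_values': [-5.0]}
```

The appeal of the framework is that these expectations live alongside your pipeline, so a failing one can feed an alerting system (the Slack integration mentioned above) instead of silently producing bad numbers.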
Deriving insights from your product’s data.
I shared an article from Snowplow two editions ago on multi-touch attribution models. It was a good overview, but didn’t really go into depth on how to put this in place. This blog post by Mark Rittman goes into those depths.
I should mention first that Snowplow’s article was about multi-touch attribution, whereas this article is about multi-channel attribution (for clarification on the difference between the two types of models, have a read here – What is multi-touch attribution?). That clarified, and before getting into developing your own multi-channel attribution model: why generate your own attribution model at all?
When you control your own attribution model and how those numbers are reported you can switch between models that align with the goals that your business has at particular points in-time; for example using a first click attribution model if your priority is lead generation, last click if conversions are your priority or even time-decay, if you want to place more emphasis on touchpoint interactions closer to conversion.
Owning your data is always a sensible choice, but once you’ve made that choice, how do you take advantage of it to develop your own multi-channel attribution model? Rittman lifts the veil on his approach, which of course involves dbt, but also goes into what the data stack looks like and how the modeling is done.
I like how he also associates users’ “value” with channels, allowing you to measure which channels bring in new users that end up generating more value. An elegant way to tackle an important question for product owners.
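To make the model-switching point concrete, here’s a minimal sketch of first-click vs. last-click attribution in Python. Rittman’s actual implementation lives in SQL/dbt on top of event data; the function and sample conversions below are purely illustrative:

```python
from collections import defaultdict

def attribute(conversions, model="last_click"):
    """Credit each conversion's revenue to a single channel, chosen from its
    ordered list of touchpoints by a first-click or last-click model."""
    credit = defaultdict(float)
    for touchpoints, revenue in conversions:
        channel = touchpoints[0] if model == "first_click" else touchpoints[-1]
        credit[channel] += revenue
    return dict(credit)

# Each conversion: (channels touched in order, revenue) — illustrative data.
conversions = [
    (["organic", "email", "paid_search"], 120.0),
    (["paid_search", "email"], 80.0),
]
print(attribute(conversions, "first_click"))  # {'organic': 120.0, 'paid_search': 80.0}
print(attribute(conversions, "last_click"))   # {'paid_search': 120.0, 'email': 80.0}
```

The same touchpoint data yields very different channel rankings under each model, which is exactly why owning the raw data (and being able to switch models as your goals change) matters.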
What’s happening in the product analytics market.
This is a bit of a follow-up to our top story, with this announcement from Looker that they are now officially part of Google Cloud. As with Snowflake, the integration of Looker into Google Cloud started out as a partnership four years ago, followed by an acquisition in June 2019.
Looker says that it will remain committed to multi-cloud and “customers will continue to have the freedom to choose from any cloud data management system like Amazon Redshift, Azure SQL, Snowflake, Oracle, Microsoft SQL Server, Teradata and more”.
But in the grand scheme of things, and as we’ve seen with our top story + other similar stories covered lately, there is certainly momentum towards consolidation. Modular components might be vendor-agnostic for now and play nicely with all other modules, but can we really bet that those vendor ecosystems won’t close up eventually and lead to vendor lock-in?