The Snowplow Analytics cookbook

Snowplow enables analysts to perform a wide variety of both simple and sophisticated analytics on your event data. In this section of the website, we provide examples and sample queries to perform many of those analyses.

Foundational concepts

Whilst it is possible to dive in and start crunching Snowplow data, it is helpful to understand the Snowplow data model and the Snowplow data pipeline. In this section, we cover these foundational concepts in detail.

  1. Events
  2. Dictionaries and schemas
  3. Contexts
  4. Iglu
  5. Stages in the Snowplow data pipeline
  6. Sending data into Snowplow
  7. Viewing event-level data in Snowplow

Data Modeling in Snowplow

The Snowplow data collection and enrichment process produces a data stream, where each data packet represents a single event. This is a rich data set, and while it is common to analyze the event-level data directly, we recommend also aggregating it into smaller, easier-to-query data sets.

Data modeling is the process of aggregating event-level data into smaller data sets, while applying business logic (e.g. sessionization) and joining with other data sets.

  1. Data modeling
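To make the idea of sessionization concrete, here is a minimal sketch in Python. In practice, Snowplow data models are usually expressed in SQL against the event tables; the field names `user_id` and `tstamp` below are illustrative placeholders, not Snowplow's actual column names. The sketch groups each user's events into sessions using a 30-minute inactivity cutoff, then aggregates each session into a single row:

```python
from datetime import datetime, timedelta
from itertools import groupby

# Illustrative cutoff: a gap of more than 30 minutes starts a new session.
SESSION_TIMEOUT = timedelta(minutes=30)

def sessionize(events):
    """Group per-user events into sessions, then aggregate each
    session into one row (user, session start, event count)."""
    events = sorted(events, key=lambda e: (e["user_id"], e["tstamp"]))
    sessions = []
    for user_id, user_events in groupby(events, key=lambda e: e["user_id"]):
        user_events = list(user_events)
        start = prev = user_events[0]["tstamp"]
        count = 0
        for e in user_events:
            if e["tstamp"] - prev > SESSION_TIMEOUT:
                # Inactivity gap exceeded: close the current session.
                sessions.append({"user_id": user_id,
                                 "session_start": start,
                                 "events": count})
                start = e["tstamp"]
                count = 0
            count += 1
            prev = e["tstamp"]
        sessions.append({"user_id": user_id,
                         "session_start": start,
                         "events": count})
    return sessions

# Example: three events for one user; the two-hour gap splits them
# into two sessions (2 events, then 1 event).
ts = datetime(2024, 1, 1, 9, 0)
raw = [
    {"user_id": "u1", "tstamp": ts},
    {"user_id": "u1", "tstamp": ts + timedelta(minutes=5)},
    {"user_id": "u1", "tstamp": ts + timedelta(hours=2)},
]
print(sessionize(raw))
```

The same shape of logic, implemented in SQL with window functions, is what a typical data modeling step applies to the event stream before analysts query the results.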

Performing Web Analysis using Snowplow data

Snowplow gives companies access to all of their granular, event-level data, and the possible applications are endless. The data modeling step lowers the barrier to entry, and enables analysts to perform a wide variety of both simple and sophisticated analyses using Snowplow data.

  1. Customer analytics. Understand your customers and users.
  2. Catalog analytics. Understand the different ways content items (articles / videos) and products in your catalog drive user behavior and value.
  3. Platform analytics. Understand how updates to your application change user behavior and grow value.

Tools and Techniques

  1. Tools and techniques. Useful techniques to employ with Snowplow data across a range of analyses.