Snowplow enables analysts to perform a wide variety of both simple and sophisticated analytics on your event data. In this section of the website, we provide examples and sample queries to perform many of those analyses.
Whilst it is possible to dive in and start crunching Snowplow data, it is helpful to understand the Snowplow data model and the Snowplow data pipeline. In this section, we cover these foundational concepts in detail.
The Snowplow data collection and enrichment process produces a data stream, in which each record represents a single event. This is a rich data set, and the possible applications of this data are endless. While it is common to run analyses against the event-level data directly, we recommend also aggregating the data into smaller, more manageable data sets.
Data modeling is the process of aggregating event-level data into these smaller data sets, while applying business logic (e.g. sessionization) and joining with other data sets.
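To make the idea concrete, here is a minimal sketch of one common data modeling step, sessionization, in Python. The field names and the 30-minute inactivity timeout are illustrative assumptions, not the Snowplow atomic event schema; in practice this logic would typically run in SQL against your event warehouse.

```python
from datetime import datetime, timedelta

# Hypothetical event-level records from the enriched event stream.
# Field names are illustrative, not the real Snowplow schema.
events = [
    {"user_id": "u1", "timestamp": datetime(2015, 1, 1, 9, 0)},
    {"user_id": "u1", "timestamp": datetime(2015, 1, 1, 9, 10)},
    {"user_id": "u1", "timestamp": datetime(2015, 1, 1, 12, 0)},
    {"user_id": "u2", "timestamp": datetime(2015, 1, 1, 9, 5)},
]

def sessionize(events, timeout=timedelta(minutes=30)):
    """Aggregate event-level data into per-user sessions.

    A new session starts whenever the gap between consecutive
    events for the same user exceeds `timeout` (business logic
    applied during the data modeling step).
    """
    sessions = []
    current = {}  # user_id -> the session currently being built
    for e in sorted(events, key=lambda e: (e["user_id"], e["timestamp"])):
        s = current.get(e["user_id"])
        if s is None or e["timestamp"] - s["end"] > timeout:
            # Gap too large (or first event for this user): open a new session.
            s = {"user_id": e["user_id"], "start": e["timestamp"],
                 "end": e["timestamp"], "event_count": 0}
            sessions.append(s)
            current[e["user_id"]] = s
        s["end"] = e["timestamp"]
        s["event_count"] += 1
    return sessions

# The four events above collapse into three sessions:
# two for u1 (the 12:00 event is past the 30-minute timeout) and one for u2.
print(sessionize(events))
```

The output, a handful of session rows instead of a raw event stream, is exactly the kind of smaller data set the modeling step produces; it could then be joined with, say, customer data before analysis.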
Snowplow gives companies access to all of their event-level data, and the possible applications of that data are endless. The data modeling step lowers the barrier to entry, enabling analysts to perform a wide variety of both simple and sophisticated analyses using Snowplow data.
© 2012 - 2015 Snowplow Analytics