Snowplow Analytics raises $10 million in Series A2 funding
Investment will fuel Snowplow’s transformation of behavioral data management and delivery
Every company should be able to collect and own its own data and data infrastructure. That’s why Snowplow’s technology is 100% open source.
Live on over 600,000 websites and countless mobile apps, Snowplow is the most widely adopted platform for collecting, processing and delivering behavioral data.
Why start from scratch when you can leverage tried and tested technology with all the benefits of BYO infrastructure?
Snowplow Open Source
A collection of data processing components that you can set up as a core data pipeline to collect behavioral data at scale across different applications, channels and touchpoints. Data teams can leverage the tech to make data available in a real-time stream and in a data warehouse.
Free yourself from the limitations of packaged tools or inflexible legacy infrastructure with Snowplow’s best-in-class data delivery platform.
Designed as a lossless solution with a schema validation step upfront in the pipeline so you can ensure that your data is always accurate, complete and actionable.
Built from the ground up for flexibility and customizability – you define your data structures, data modeling logic and pipeline configurations, and easily evolve them alongside your business.
You have complete ownership of your data, with total control over and visibility into your data processing pipeline. Your data never leaves your cloud environment.
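The lossless, validate-upfront design can be sketched in miniature. Snowplow events are self-describing: each one declares the schema it claims to conform to, and events that fail validation are routed to a failed-events stream rather than silently dropped. The sketch below is illustrative only, not Snowplow’s actual implementation; the schema URI and field names are made up for the example.

```python
# Illustrative sketch (not Snowplow's actual code) of upfront schema
# validation in a lossless pipeline: bad events are routed aside for
# inspection and recovery instead of being discarded.

# Hypothetical registry of required fields, keyed by an Iglu-style
# schema URI (the URI and fields here are invented for illustration).
SCHEMAS = {
    "iglu:com.example/add_to_basket/jsonschema/1-0-0": {
        "required": {"sku": str, "quantity": int},
    }
}

def validate(event: dict):
    """Route one self-describing event: returns (good, failed)."""
    schema = SCHEMAS.get(event.get("schema"))
    if schema is None:
        # Unknown schema: keep the raw event so nothing is lost.
        return None, {"event": event, "reason": "unknown schema"}
    data = event.get("data", {})
    for field, ftype in schema["required"].items():
        if not isinstance(data.get(field), ftype):
            return None, {"event": event, "reason": f"bad field: {field}"}
    return event, None

good, failed = validate({
    "schema": "iglu:com.example/add_to_basket/jsonschema/1-0-0",
    "data": {"sku": "ABC-123", "quantity": 2},
})
```

Because every event either passes validation or lands, intact, in the failed stream with a reason attached, the pipeline stays lossless while keeping the warehouse clean.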
Open source challenge: The real-time pipeline is a collection of data processing components without built-in monitoring and scaling. To monitor your pipeline, ensure uptime and keep latency down, you would need to hire SREs across multiple time zones. Businesses often underestimate the resources it takes to ensure this in-house.
Snowplow Insights solution: We offer custom SLAs on uptime, latency and support response time. If any of these are breached, it’s on us. Support engineers work across all time zones to make sure your pipeline runs smoothly, 24/7. This means your team can focus on driving value from your data.
Open source challenge: Keeping your tech up to date with the latest AWS and GCP changes is difficult to manage in house, especially with big updates such as moving from batch to a real-time pipeline.
Snowplow Insights solution: We set you up with the latest Snowplow tech and manage it for you, so you don’t have to worry about falling behind on updates. As true experts in this technology, we upgrade it for you several times a year and work full time to improve the pipeline and adapt it to the latest changes. For example, Insights customers were upgraded to a real-time pipeline when AWS deprecated support for Tomcat 8 on Elastic Beanstalk in March 2020.
Open source challenge: Companies often underestimate the difficulty in setting up and maintaining the open source tech. Snowplow OS is a collection of data processing components, without a common protocol, and to set up the pipeline correctly takes significant effort. We often see sub-optimal configurations when customers move from OS to Insights.
Snowplow Insights solution: Solutions Architects embed themselves in your business to take you through a custom onboarding process, set up your pipeline to match your use case and business, and provide tailored training to your teams. Solutions Architects also work with you on tracking design and SQL data modeling, architecture and deployment.
Open source challenge: OS users do not have a way to monitor and manage data quality. This means many failed events go unresolved, leading to incomplete or inaccurate data.
Snowplow Insights solution: From the Snowplow Insights Console, you can keep track of data quality, explore failed events and pinpoint the source of bad data. You can also update and edit your data structures, monitor the health of your pipeline and easily configure enrichments – all from one place.
|Feature|Benefit|Snowplow Insights|Snowplow Open Source|
|---|---|---|---|
|The Snowplow Technology|Snowplow’s best-in-class data processing technology running in your cloud account.|Yes|Yes|
|Complete ownership & zero vendor lock-in|Full ownership of your data, data models and pipeline components, even if you decide to switch solutions.|Yes|Yes|
|Full transparency|Every stage of your pipeline is auditable, with complete access to the underlying technology.|Yes|Yes|
|Monitoring, scaling & maintenance|Snowplow takes care of everything from installation and upgrades to monitoring and autoscaling, so your team can focus on business goals and leave the maintenance of your pipeline to the Snowplow experts.|Yes|No|
|Implementation & support|Tailored onboarding and training, co-design of your tracking strategy and SQL data model, and continued access to 24/7 support.|Yes|No|
|Data quality UI and API|Automatic alerting on the emergence of new data quality issues, plus a structured workflow to diagnose and resolve them.|Yes|No|
|Pipeline configuration and monitoring UI|An easy user interface to set up and monitor your data pipeline, and safely apply configuration changes once it is running.|Yes|No|
|Data structures UI and API|A structured workflow to define, govern and evolve your data structures.|Yes|No|
|Uptime & latency SLAs|Guaranteed collector uptime and data delivery latency, so your engineers don’t need to take on the stressful task of ensuring uptime and low latency.|Yes|No|