Pipeline Components and Applications

Hosted assets

To simplify setting up and running Snowplow, the Snowplow Analytics team provide public hosting for some of the Snowplow sub-components. These hosted assets are publicly available through Amazon Web Services (CloudFront and S3), and using them is free for Snowplow community members.

As we release new versions of these assets, we will leave old versions unchanged on their existing URLs – so you won’t have to upgrade your own Snowplow installation unless you want to.

Disclaimer: While Snowplow Analytics Ltd will make every reasonable effort to host these assets, we will not be liable for any failure to provide this service. All of the hosted assets listed below are freely available via our GitHub repository and you are encouraged to host them yourselves.

The current versions of the assets hosted by the Snowplow Analytics team are as follows:

0. Snowplow CLI

We are steadily moving over to Bintray for hosting binaries and artifacts which don’t have to be hosted on S3.

To make operating Snowplow easier, the EmrEtlRunner app is now available as a prebuilt executable in a single zipfile here:

http://dl.bintray.com/snowplow/snowplow-generic/snowplow_emr_r117_biskupin.zip

Download the zipfile from the URL above to save it locally.

Note: the link above refers to the latest version at the time of writing (R117). If a newer version has been released, you can locate and download it from the generic Bintray page by searching for the pattern snowplow_emr_; the higher the version number, the newer the release.
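
If you prefer to script the download, a minimal Python sketch using only the standard library might look like this. It fetches the R117 asset listed above; swap in a newer URL if one is available:

  import urllib.request
  import zipfile

  # The R117 asset listed above; substitute a newer version if one
  # has been released on the generic Bintray page.
  url = "http://dl.bintray.com/snowplow/snowplow-generic/snowplow_emr_r117_biskupin.zip"
  local_zip = "snowplow_emr_r117_biskupin.zip"

  urllib.request.urlretrieve(url, local_zip)

  # Unpack the prebuilt executables into ./snowplow-cli
  with zipfile.ZipFile(local_zip) as archive:
      archive.extractall("snowplow-cli")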

1. Trackers

1.1 JavaScript Tracker resources

We recommend you follow our guide to Self hosting and host the latest version of sp.js available from GitHub, or use the public jsDelivr or cdnjs CDNs.

2. Collectors

2.1 Clojure Collector resources

The Clojure Collector packaged as a complete WAR file, ready for Amazon Elastic Beanstalk, is here:

s3://snowplow-hosted-assets/2-collectors/clojure-collector/clojure-collector-2.1.2-standalone.war

The same file is also available for direct HTTP download via the CloudFront CDN.
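
For scripted downloads straight from S3, a short boto3 sketch with anonymous (unsigned) access could look like this; the bucket and key come from the s3:// URI above:

  import boto3
  from botocore import UNSIGNED
  from botocore.config import Config

  # Anonymous client: the hosted-assets bucket is public, so no
  # AWS credentials are needed.
  s3 = boto3.client("s3", region_name="eu-west-1",
                    config=Config(signature_version=UNSIGNED))

  s3.download_file(
      "snowplow-hosted-assets",
      "2-collectors/clojure-collector/clojure-collector-2.1.2-standalone.war",
      "clojure-collector-2.1.2-standalone.war",
  )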

2.2 Scala Stream Collector resources

The Scala Stream Collector artifacts are available on Bintray here:

https://bintray.com/snowplow/snowplow-generic/snowplow-scala-stream-collector/1.0.0#files

Choose the artifact for your target platform (Kinesis, Google Pub/Sub, Kafka or NSQ).

You can also find the corresponding images on Docker Hub.

3. Enrich

3.1 Spark Enrich resources

The Spark Enrich process uses a single jarfile containing the Spark job. This is made available in a public Amazon S3 bucket, for Snowplowers who are running their Spark Enrich process on Amazon EMR:

s3://snowplow-hosted-assets/3-enrich/spark-enrich/snowplow-spark-enrich-1.19.0.jar

The same file is also available for direct HTTP download via the CloudFront CDN.

3.2 Stream Enrich resources

The Stream Enrich app is available on Bintray here:

https://bintray.com/snowplow/snowplow-generic/snowplow-stream-enrich/1.0.0#files

Choose the artifact for your target platform:

  • Kinesis
  • Kafka
  • NSQ

You can also find the corresponding images on Docker Hub.

3.3 Beam Enrich resources

Beam Enrich can be found on Bintray.

You can also find the image on Docker Hub: beam-enrich

3.4 Scala Hadoop Event Recovery resources

The Scala Hadoop Event Recovery (formerly Hadoop Bad Rows) tool uses a single jarfile containing the MapReduce job. This is made available in a public Amazon S3 bucket:

s3://snowplow-hosted-assets/3-enrich/hadoop-event-recovery/snowplow-hadoop-event-recovery-0.2.0.jar

The same file is also available for direct HTTP download via the CloudFront CDN.

3.5 Shared resources

3.5.1 User-Agent parser database

The UA parser enrichment makes use of the uap-core database, also stored in this public Amazon S3 bucket:

s3://snowplow-hosted-assets/third-party/ua-parser/regexes.yaml

This file is updated every month by the Snowplow Analytics team. For the copy hosted closest to you, see the per-region bucket table in section 7 below.
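
Since the file is refreshed monthly, you may want to check its timestamp before re-downloading. A small boto3 sketch (anonymous access against the eu-west-1 master bucket) could look like this:

  import boto3
  from botocore import UNSIGNED
  from botocore.config import Config

  s3 = boto3.client("s3", region_name="eu-west-1",
                    config=Config(signature_version=UNSIGNED))

  # HEAD the object to read its LastModified timestamp without
  # downloading the file itself.
  meta = s3.head_object(Bucket="snowplow-hosted-assets",
                        Key="third-party/ua-parser/regexes.yaml")
  print("regexes.yaml last updated:", meta["LastModified"])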

4. Storage

4.1 Relational Database Shredder resources

The Relational Database Shredder process uses a single jarfile containing the Spark job. This is made available in a public Amazon S3 bucket, for Snowplowers who are running their Spark Enrich & Shred process on Amazon EMR:

s3://snowplow-hosted-assets/4-storage/rdb-shredder/snowplow-rdb-shredder-0.13.0.jar

The same file is also available for direct HTTP download via the CloudFront CDN.

4.2 Redshift Storage resources

Our shredding process for loading JSONs into Redshift uses a standard set of JSON Path files, available here:

s3://snowplow-hosted-assets/4-storage/redshift-storage/jsonpaths

If you are running RDB Loader, these files will automatically be used for loading corresponding JSONs into Redshift.
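
To see which JSON Path files are available under that location, you can list the prefix anonymously; a short boto3 sketch:

  import boto3
  from botocore import UNSIGNED
  from botocore.config import Config

  s3 = boto3.client("s3", region_name="eu-west-1",
                    config=Config(signature_version=UNSIGNED))

  # List the JSON Path files under the redshift-storage prefix
  # (a single page of results is enough for a quick look).
  resp = s3.list_objects_v2(Bucket="snowplow-hosted-assets",
                            Prefix="4-storage/redshift-storage/jsonpaths/")
  for obj in resp.get("Contents", []):
      print(obj["Key"])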

4.3 Elasticsearch Loader resources

The Elasticsearch Loader app is available for both Elasticsearch APIs (HTTP and transport) on Bintray:

  • HTTP API (works with every Elasticsearch version):
    http://dl.bintray.com/snowplow/snowplow-generic/snowplow_elasticsearch_loader_http_0.10.1.zip
  • transport API, for 5.x clusters:
    http://dl.bintray.com/snowplow/snowplow-generic/snowplow_elasticsearch_loader_tcp_0.10.1.zip
  • transport API, for 2.x clusters:
    http://dl.bintray.com/snowplow/snowplow-generic/snowplow_elasticsearch_loader_tcp_2x_0.10.1.zip
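
If you select the artifact programmatically, the mapping above can be encoded in a small helper. This is purely an illustrative sketch; the es_loader_url function is hypothetical and not part of any Snowplow tooling:

  # Hypothetical helper: maps an Elasticsearch cluster to the
  # matching loader artifact above. Not part of Snowplow tooling.
  def es_loader_url(es_major: int, transport: bool = False) -> str:
      base = "http://dl.bintray.com/snowplow/snowplow-generic/"
      if not transport:
          # The HTTP API build works with every Elasticsearch version
          return base + "snowplow_elasticsearch_loader_http_0.10.1.zip"
      if es_major == 5:
          return base + "snowplow_elasticsearch_loader_tcp_0.10.1.zip"
      if es_major == 2:
          return base + "snowplow_elasticsearch_loader_tcp_2x_0.10.1.zip"
      raise ValueError("transport API builds exist only for 2.x and 5.x clusters")

  print(es_loader_url(5, transport=True))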

4.4 Snowplow S3 Loader resources

The Snowplow S3 Loader app is available for download separately here:

http://dl.bintray.com/snowplow/snowplow-generic/snowplow_s3_loader_0.6.0.zip

Download the zipfile from the URL above to save it locally.

5. Analytics

No hosted assets currently.

6. Relays (AWS Lambda)

You can find the jars for our relays in our public S3 buckets close to your region (see the per-region bucket table in section 7).

6.1 Snowplow Piinguin Relay

You can find the Snowplow Piinguin Relay in an S3 bucket in your region (see the per-region bucket table in section 7).

For instance, for eu-west-1 the asset is at: s3://snowplow-hosted-assets/relays/piinguin/snowplow-piinguin-relay-0.1.0.jar

6.2 Snowplow Indicative Relay

You can find the Snowplow Indicative Relay in an S3 bucket in your region (see the per-region bucket table in section 7).

For instance, for eu-west-1 the asset is at: s3://snowplow-hosted-assets/relays/indicative/indicative-relay-0.1.0.jar

7. S3 hosted asset bucket per region

Region            Bucket
eu-west-1         snowplow-hosted-assets
us-east-1         snowplow-hosted-assets-us-east-1
us-west-1         snowplow-hosted-assets-us-west-1
us-west-2         snowplow-hosted-assets-us-west-2
sa-east-1         snowplow-hosted-assets-sa-east-1
eu-central-1      snowplow-hosted-assets-eu-central-1
ap-southeast-1    snowplow-hosted-assets-ap-southeast-1
ap-southeast-2    snowplow-hosted-assets-ap-southeast-2
ap-northeast-1    snowplow-hosted-assets-ap-northeast-1
ap-south-1        snowplow-hosted-assets-ap-south-1
us-east-2         snowplow-hosted-assets-us-east-2
ca-central-1      snowplow-hosted-assets-ca-central-1
eu-west-2         snowplow-hosted-assets-eu-west-2
ap-northeast-2    snowplow-hosted-assets-ap-northeast-2
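
The naming scheme in this table is regular enough to encode as a lookup; here is an illustrative Python sketch (the hosted_assets_bucket function is hypothetical, not a Snowplow API):

  # Hypothetical helper encoding the table above: eu-west-1 is the
  # unsuffixed master bucket; every other listed region appends its
  # own name to "snowplow-hosted-assets-".
  SUFFIXED_REGIONS = {
      "us-east-1", "us-west-1", "us-west-2", "sa-east-1",
      "eu-central-1", "ap-southeast-1", "ap-southeast-2",
      "ap-northeast-1", "ap-south-1", "us-east-2",
      "ca-central-1", "eu-west-2", "ap-northeast-2",
  }

  def hosted_assets_bucket(region: str) -> str:
      if region == "eu-west-1":
          return "snowplow-hosted-assets"
      if region in SUFFIXED_REGIONS:
          return "snowplow-hosted-assets-" + region
      raise ValueError("no hosted-assets bucket in region " + region)

  print(hosted_assets_bucket("ap-south-1"))  # snowplow-hosted-assets-ap-south-1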

See also

As well as these hosted assets for running Snowplow, the Snowplow Analytics team also make code components and libraries available through Ruby and Java artifact repositories.

Please see the Artifact repositories wiki page for more information.