Pipeline Components and Applications

Hosted assets

To simplify setting up and running Snowplow, the Snowplow Analytics team provide public hosting for some of the Snowplow sub-components. These hosted assets are publicly available through Amazon Web Services (CloudFront and S3), and using them is free for Snowplow community members. Some artifacts and binaries which do not need to be hosted on S3 are available via GitHub releases, though that channel is slowly being deprecated in favour of Docker images hosted on Docker Hub.

As we release new versions of these assets, we will leave old versions unchanged on their existing URLs – so you won’t have to upgrade your own Snowplow installation unless you want to.

Disclaimer: While Snowplow Analytics Ltd will make every reasonable effort to host these assets, we will not be liable for any failure to provide this service. All of the hosted assets listed below are freely available via our GitHub repository and you are encouraged to host them yourselves.

The current versions of the assets hosted by the Snowplow Analytics team are as follows:

0. Snowplow CLI

To make operating Snowplow easier, the EmrEtlRunner app is now available as a prebuilt executable in a single zipfile here:

https://github.com/snowplow/emr-etl-runner/releases/download/1.0.4/snowplow_emr_1.0.4.zip

Right-click the link above to save it locally.

Note: The link above refers to the latest version at the time of writing (1.0.4). If you know there is a newer version you can locate and download it from the releases page.
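
For example, on Linux or macOS you can fetch and unpack the executable from the command line. The commands below are a minimal sketch; the name of the unpacked binary can vary slightly between releases, so check the zipfile contents.

wget https://github.com/snowplow/emr-etl-runner/releases/download/1.0.4/snowplow_emr_1.0.4.zip
unzip snowplow_emr_1.0.4.zip
./snowplow-emr-etl-runner --help    # executable name is illustrative – confirm it from the unzipped contents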

1. Trackers

1.1 JavaScript Tracker resources

We recommend you follow our guide to Self hosting and host the latest version of sp.js available from GitHub, or use the public jsDelivr or cdnjs CDNs.
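
If you choose to self-host, a minimal sketch of uploading sp.js to your own S3 bucket is shown below. The local file path, bucket name, ACL and cache settings are placeholders, not prescriptions – substitute your own values and follow the Self hosting guide for the full set-up.

# assumes you have already downloaded the latest sp.js from GitHub; the bucket name and settings below are placeholders
aws s3 cp ./sp.js s3://your-assets-bucket/js/sp.js --acl public-read --cache-control "max-age=3600"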

2. Scala Stream Collector

Docker images for each message queue can be found on Docker Hub.
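
For example, you can pull an image directly with docker pull. The repository names below follow the collector's per-queue naming and are illustrative – check Docker Hub for the exact repository and tag you need.

docker pull snowplow/scala-stream-collector-kinesis   # Kinesis variant (repository names are illustrative)
docker pull snowplow/scala-stream-collector-kafka     # Kafka variant
docker pull snowplow/scala-stream-collector-nsq       # NSQ variant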

3. Enrich

3.1 Stream Enrich

Docker images for each message queue can be found on Docker Hub.

3.2 Beam Enrich resources

The Beam Enrich Docker image can be found on Docker Hub.
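
As with the collector, the Enrich images can be pulled straight from Docker Hub. The repository names below are illustrative – verify them, and pin a specific tag, on Docker Hub.

docker pull snowplow/stream-enrich-kinesis   # Stream Enrich, Kinesis variant (illustrative name)
docker pull snowplow/beam-enrich             # Beam Enrich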

3.3 Shared resources

3.3.1 User-Agent parser database

The UA parser enrichment makes use of the uap-core database, also stored in this public Amazon S3 bucket:

s3://snowplow-hosted-assets/third-party/ua-parser/regexes.yaml

This file is updated every month by the Snowplow Analytics team. See the table in section 7 below for the bucket in your region.
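
The bucket is publicly readable, so you can fetch the file without AWS credentials, for example:

aws s3 cp --no-sign-request s3://snowplow-hosted-assets/third-party/ua-parser/regexes.yaml .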

4. Storage

4.1 Relational Database Shredder resources

The Relational Database Shredder process uses a single jarfile containing the Spark job. This is made available in a public Amazon S3 bucket, for Snowplowers who are running their Spark Enrich & Shred process on Amazon EMR:

s3://snowplow-hosted-assets/4-storage/rdb-shredder/snowplow-rdb-shredder-0.18.2.jar

It is also served via our CloudFront CDN if you prefer to download it over HTTP.
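
In normal operation EMR reads the jar straight from S3, but you can also copy it down to inspect it locally, for example:

aws s3 cp --no-sign-request s3://snowplow-hosted-assets/4-storage/rdb-shredder/snowplow-rdb-shredder-0.18.2.jar .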

4.2 Redshift Storage resources

Our shredding process for loading JSONs into Redshift uses a standard set of JSON Path files, available here:

s3://snowplow-hosted-assets/4-storage/redshift-storage/jsonpaths

If you are running RDB Loader, these files will automatically be used for loading corresponding JSONs into Redshift.
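
You can browse the hosted JSON Path files without AWS credentials, for example:

aws s3 ls --no-sign-request s3://snowplow-hosted-assets/4-storage/redshift-storage/jsonpaths/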

4.3 Elasticsearch Loader resources

The Elasticsearch Loader app is available as a Docker image on Docker Hub or as an asset downloadable from the GitHub releases.

4.4 Snowplow S3 Loader resources

The Snowplow S3 Loader app is available as a Docker image from Docker Hub or as an asset downloadable from the GitHub releases.

4.5 Snowflake Loader

A fat jar for v0.8.0 can be downloaded via HTTP.

5. Analytics

No hosted assets currently.

6. Relays (AWS Lambda)

You can find the jars for our relays in the public S3 bucket closest to your region (see the table in section 7 for your bucket).

6.1 Snowplow Piinguin Relay

You can find the Snowplow Piinguin Relay in an S3 bucket in your region (see the table in section 7 for your bucket).

For instance for eu-west-1 the asset will be at: s3://snowplow-hosted-assets/relays/piinguin/snowplow-piinguin-relay-0.1.0.jar

6.2 Snowplow Indicative Relay

You can find the Snowplow Indicative Relay in an S3 bucket in your region (see the table in section 7 for your bucket).

For instance for eu-west-1 the asset will be at: s3://snowplow-hosted-assets/relays/indicative/indicative-relay-0.3.0.jar
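
To fetch a relay jar from a bucket closer to you, substitute the bucket name for your region from the table in section 7. The example below assumes the regional buckets share the same key layout:

aws s3 cp --no-sign-request s3://snowplow-hosted-assets-us-east-1/relays/indicative/indicative-relay-0.3.0.jar .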

7. S3 hosted asset bucket per region

Region            Bucket
eu-west-1         snowplow-hosted-assets
us-east-1         snowplow-hosted-assets-us-east-1
us-west-1         snowplow-hosted-assets-us-west-1
us-west-2         snowplow-hosted-assets-us-west-2
sa-east-1         snowplow-hosted-assets-sa-east-1
eu-central-1      snowplow-hosted-assets-eu-central-1
ap-southeast-1    snowplow-hosted-assets-ap-southeast-1
ap-southeast-2    snowplow-hosted-assets-ap-southeast-2
ap-northeast-1    snowplow-hosted-assets-ap-northeast-1
ap-south-1        snowplow-hosted-assets-ap-south-1
us-east-2         snowplow-hosted-assets-us-east-2
ca-central-1      snowplow-hosted-assets-ca-central-1
eu-west-2         snowplow-hosted-assets-eu-west-2
ap-northeast-2    snowplow-hosted-assets-ap-northeast-2

See also

As well as these hosted assets for running Snowplow, the Snowplow Analytics team also make code components and libraries available through Ruby and Java artifact repositories.

Please see the Artifact repositories wiki page for more information.