

There are two ways you can leverage your Snowplow data in Indicative:

  • A direct integration with your data in S3 (on AWS), BigQuery (GCP) or Snowflake. For more information on this, contact Indicative directly.
  • The Snowplow Indicative relay that sends the data directly from Kinesis to Indicative (AWS only).

Depending on your use case and setup, you may prefer one integration over the other. If you are unsure, reach out to Indicative directly.

Please note that the remainder of this documentation covers the Snowplow Indicative relay only.

Why Indicative?

Indicative is a behavioral analytics platform designed for business users to understand and analyze the full customer journey. It connects to behavioral data sources so that businesses can conduct sophisticated analysis of how customers interact with their product and maximize customer value. Indicative gives product managers and marketers:

  • the ability to analyse customer journeys without any SQL knowledge
  • the ability to build custom and complex funnels with variable exit points and steps
  • the ability to easily build and analyse user segments

Connecting Snowplow to Indicative

1. Create your Indicative account

If you do not have an Indicative account, first go to Indicative to register. 

2. Configure your Indicative API key

Select Snowplow as your data source and copy your Indicative API Key. 

3. Share your Indicative API key with Snowplow 

Log in to your Snowplow Console and send your API key securely through freeform messaging under Credentials Sharing. Make sure to provide some context, for example: “I am sharing my Indicative API key <key> to get my Snowplow data flowing to Indicative”. Snowplow Support will alert you once the data starts flowing.

(Screenshot: the Indicative API key screen)

4. Validate your data 

Go to your Indicative project to check that you are receiving data. You can also use the debug console to troubleshoot the relay in real time.

Additional Configuration 

By default, all of your Snowplow data is sent to your Indicative implementation. However, the relay supports filtering out:

  • Unused events
  • Unused atomic events fields
  • Unused contexts

If you’d like to filter out certain events, atomic fields or contexts, just let us know and we’ll be happy to set that up for you.
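Conceptually, these filtering options amount to dropping whole events, stripping atomic fields, and removing contexts before the data is forwarded. The sketch below illustrates the idea only; the event shape and all names in it (`UNUSED_EVENTS`, `page_ping`, `performance_timing`, and so on) are hypothetical, and the actual filtering is configured by the Snowplow team on your behalf, not through code you write:

```python
# Illustrative sketch of the relay's three filtering options.
# Event/field/context names here are made up for the example.

UNUSED_EVENTS = {"page_ping"}                   # hypothetical: events to drop entirely
UNUSED_ATOMIC_FIELDS = {"etl_tstamp", "v_etl"}  # hypothetical: atomic fields to strip
UNUSED_CONTEXTS = {"performance_timing"}        # hypothetical: contexts to strip


def filter_event(event):
    """Return a filtered copy of an enriched event, or None to drop it."""
    # Option 1: drop unused events entirely
    if event.get("event_name") in UNUSED_EVENTS:
        return None
    # Option 2: strip unused atomic event fields
    filtered = {k: v for k, v in event.items() if k not in UNUSED_ATOMIC_FIELDS}
    # Option 3: strip unused contexts
    if "contexts" in filtered:
        filtered["contexts"] = [
            c for c in filtered["contexts"] if c.get("name") not in UNUSED_CONTEXTS
        ]
    return filtered
```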

In the Snowplow canonical event model, so-called ‘structured events’ all have their event_name field set to event. This can be confusing, so many users prefer to designate one of those events’ special fields (most commonly se_action) as the ‘event name field’. This is also the default in the Indicative relay. Please get in touch if you would like to use a different field, such as se_category or any other.
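The renaming described above can be sketched as follows. The field names (event_name, se_action, se_category) come from the Snowplow canonical event model, but the function itself is illustrative, not the relay’s actual implementation:

```python
def event_name_for_indicative(event, name_field="se_action"):
    """Pick the event name to surface in Indicative.

    Structured events all arrive with event_name == "event", so by
    default the relay uses the se_action field as the name instead.
    All other events keep their event_name unchanged.
    """
    if event.get("event_name") == "event":
        return event.get(name_field, "event")
    return event["event_name"]


# A structured event surfaces under its se_action value...
struct_event = {"event_name": "event", "se_category": "video", "se_action": "play"}
print(event_name_for_indicative(struct_event))  # play

# ...while a self-describing event like page_view is unaffected.
page_view = {"event_name": "page_view"}
print(event_name_for_indicative(page_view))  # page_view
```

Passing a different value for name_field (for example "se_category") models the alternative configuration mentioned above.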