ETL Pipelines integration is available for the Scale plan.

With ETL Pipelines, Vital continuously streams all events to a supported destination over direct, authenticated TLS connections. Compared to Webhooks, this reduces your operational complexity by removing the need to operate public, unauthenticated HTTPS endpoints for receiving incoming Webhooks.

  • Scalable: Offload the pressure on your public HTTP services.
  • Secure: Events are published through authenticated channels over TLS.
  • Compressed: Data events larger than 1 KiB are compressed before publication.
  • Ordered: Publication order can be preserved for select destination types.

Destinations

The following destinations are supported:

Name | Authentication | Publication Order
RabbitMQ | Password | Depends on your RabbitMQ broker configuration.
Google Cloud Pub/Sub | IAM | Pub/Sub Event Ordering is supported.
Azure Event Hub | SAS Key | Depends on your Azure Event Hub configuration.

Event Schema

Webhooks, ETL Pipelines and our API endpoints all share the same JSON object schema, and Webhooks and ETL Pipelines use an identical JSON event payload structure.

Check out our Event Catalog.

Features

Data compression

Vital does not compress payload blobs smaller than 1 KiB. Please refer to the destination-specific documentation below for how to tell a compressed blob from an uncompressed one.
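The decoding pattern is the same everywhere; only the name of the compression marker changes (blob-codec for Pub/Sub, Content-Encoding for RabbitMQ, content-encoding for Event Hub). A minimal sketch in Python:

```python
import gzip
import json


def decode_event(blob: bytes, codec: str | None) -> dict:
    """Decode a Vital event payload blob.

    `codec` is the destination-specific compression marker, e.g. the
    `blob-codec` Pub/Sub attribute or the `Content-Encoding` header.
    """
    if codec == "gzip":
        blob = gzip.decompress(blob)
    return json.loads(blob)
```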

Static outbound IPs

Vital establishes connections to your ETL Pipeline destination through a static pool of IPs. Contact Vital support to obtain the up-to-date list of static IPs.

Double writing for migration

To ensure a smooth switchover from Webhooks to your new ETL Pipelines destination, Vital supports double-writing events to the combination of:

  • all your Webhook endpoints; and
  • your ETL Pipelines destination.

This is not a proper fan-out feature; it is only intended to help customers evaluate destinations and perform one-off migrations between them.

There can only be one preferred destination, and the secondary destination(s) do not enjoy the same reliability guarantee as the preferred one.

Pushed Historical Data

When you use ETL Pipelines, Vital can push all historical data as daily.data.* data events to your ETL Pipelines destination. The data-less historical.data.*.created event is still emitted to denote the completion of the historical pull.

Configuring ETL Pipelines

You can change the ETL Pipelines configuration on your Teams at any time through the Set Team ETL Pipelines endpoint of the Org Management API.

Your new configuration is typically active as soon as the endpoint has acknowledged your request.

Feel free to reach out to Vital support if you need assisted setup, or if you need a Vital Org Key to access the Org Management API.
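For a rough idea of what this looks like, the sketch below issues one authenticated HTTP call with the requests library. The base URL, path, header name and payload fields are placeholders, not the actual contract; consult the Set Team ETL Pipelines endpoint reference in the Org Management API for the authoritative request shape.

```python
import requests

# All values below are placeholders; the real path, header and payload shape
# are defined by the Set Team ETL Pipelines endpoint of the Org Management API.
ORG_MANAGEMENT_BASE_URL = "https://api.tryvital.io/management"  # hypothetical base URL
VITAL_ORG_KEY = "org_key_..."  # obtained from Vital support
TEAM_ID = "<your-team-id>"

response = requests.put(
    f"{ORG_MANAGEMENT_BASE_URL}/team/{TEAM_ID}/etl-pipelines",  # hypothetical path
    headers={"X-Vital-Org-Key": VITAL_ORG_KEY},                 # hypothetical header name
    json={
        # Hypothetical payload: a RabbitMQ destination with password authentication.
        "destination": "rabbitmq",
        "credentials": {"username": "vital", "password": "<secret>"},
    },
)
response.raise_for_status()
```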

Google Cloud Pub/Sub

Event Structure

Each event is published with these Cloud Pub/Sub message attributes:

Type | Key | Value
Attribute | event-type | Vital event type, e.g., daily.data.activity.created
Attribute | blob-type | json: UTF-8 JSON document
Attribute | blob-codec | gzip: the blob is compressed by gzip. null or absent: the blob is uncompressed.
Attribute | idempotency-key | The event delivery ID; only unique amongst all events from the same user and of the same event type.
Ordering Key | - | Vital User ID (if you have enabled Event Ordering)

Payloads smaller than 1 KiB are not compressed. You must check blob-codec to determine whether the message payload blob needs decompression.

At this time, Vital publishes exclusively JSON blobs (blob-type == json).

Having said that, you are strongly encouraged to check for and drop any events with an unrecognized blob-type or blob-codec.
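As an illustrative sketch using the google-cloud-pubsub client (the project ID, subscription name and processing logic are placeholders, not part of Vital's contract), a subscriber that applies these checks could look like this:

```python
import gzip
import json

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    event_type = message.attributes.get("event-type")
    blob_type = message.attributes.get("blob-type")
    blob_codec = message.attributes.get("blob-codec")

    # Drop anything with an unrecognized blob-type or blob-codec.
    if blob_type != "json" or blob_codec not in (None, "", "gzip"):
        message.ack()
        return

    data = message.data
    if blob_codec == "gzip":
        data = gzip.decompress(data)
    event = json.loads(data)

    print("received", event_type)  # replace with your own handling of `event`
    message.ack()


subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("<gcp-project>", "<vital-events-subscription>")
streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
streaming_pull_future.result()  # block and process messages indefinitely
```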

Pub/Sub event ordering

If you wish to receive the events of each Vital user in their publication order, Vital supports Pub/Sub Event Ordering, and uses the Vital User ID as the event ordering key.

Vital publishes all events through the us-central1 and europe-west1 Pub/Sub regional endpoints.

We discourage using Pub/Sub Exactly-Once Delivery together with Event Ordering. Our own trials suggest that the combination struggles to sustain a high volume of events.
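If you want ordered delivery, note that message ordering has to be switched on for the subscription itself. A minimal sketch (project, topic and subscription names are placeholders):

```python
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
topic_path = publisher.topic_path("<gcp-project>", "<vital-events-topic>")
subscription_path = subscriber.subscription_path("<gcp-project>", "<vital-events-subscription>")

# Message ordering must be enabled at subscription creation time; Vital sets
# the Vital User ID as the ordering key on every message it publishes.
subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "enable_message_ordering": True,
    }
)
```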

IAM Permission

Grant the relevant IAM principals these pre-defined roles on your topic:

  • Pub/Sub Publisher (roles/pubsub.publisher)
  • Pub/Sub Viewer (roles/pubsub.viewer)

Alternatively, grant these specific permissions if you wish to create a custom IAM role with a minimized grant:

  • pubsub.topics.get
  • pubsub.topics.publish

Grant these roles or permissions to the Vital publisher service account for your environment and region:

Environment | Region | Service Account
Sandbox | US | customer-topics-us@vital-sandbox.iam.gserviceaccount.com
Sandbox | Europe | customer-topics-eu@vital-sandbox.iam.gserviceaccount.com
Production | US | customer-topics-us@vital-prod-307415.iam.gserviceaccount.com
Production | Europe | customer-topics-eu@vital-prod-307415.iam.gserviceaccount.com
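As a sketch (project and topic names are placeholders; the service account shown is the Production US publisher from the table above), the grant can also be applied programmatically through the Pub/Sub IAM API:

```python
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("<gcp-project>", "<vital-events-topic>")

vital_publisher = (
    "serviceAccount:customer-topics-us@vital-prod-307415.iam.gserviceaccount.com"
)

# Fetch the current policy, append the two pre-defined roles, and write it back.
policy = publisher.get_iam_policy(request={"resource": topic_path})
policy.bindings.add(role="roles/pubsub.publisher", members=[vital_publisher])
policy.bindings.add(role="roles/pubsub.viewer", members=[vital_publisher])
publisher.set_iam_policy(request={"resource": topic_path, "policy": policy})
```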

RabbitMQ

Event Structure

Each event is published with these RabbitMQ message fields:

Key | Value
Routing Key | Vital event type, e.g., daily.data.activity.created
Content-Type | application/json: UTF-8 JSON document
Content-Encoding | gzip: the blob is compressed by gzip. null or absent: the blob is uncompressed.
Data | Encoded JSON blob. This may or may not be compressed; see Content-Encoding.

Payloads smaller than 1 KiB are not compressed. You must check Content-Encoding to determine whether the message payload blob needs decompression.

At this time, Vital publishes exclusively JSON blobs (Content-Type == application/json).

Having said that, you are strongly encouraged to check for and drop any events with an unrecognized Content-Type or Content-Encoding.
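As an illustrative consumer sketch using pika (the broker URL, queue name and processing logic are placeholders):

```python
import gzip
import json

import pika  # pip install pika


def on_message(channel, method, properties, body):
    # Drop events with an unrecognized Content-Type or Content-Encoding.
    if properties.content_type != "application/json" or properties.content_encoding not in (None, "gzip"):
        channel.basic_ack(method.delivery_tag)
        return

    if properties.content_encoding == "gzip":
        body = gzip.decompress(body)
    event = json.loads(body)

    print("received", method.routing_key)  # replace with your own handling of `event`
    channel.basic_ack(method.delivery_tag)


connection = pika.BlockingConnection(
    pika.URLParameters("amqps://<user>:<password>@<broker-host>/<vhost>")
)
channel = connection.channel()
channel.basic_consume(queue="<vital-events-queue>", on_message_callback=on_message)
channel.start_consuming()
```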

Broker authentication

Vital only supports password authentication at this time.

Vital publishes events with the following settings:

  • Publisher Confirm mode is required
  • Events are published with the Mandatory flag but without the Immediate flag.
  • Events are published with Event Type as the Routing Key.

Please get in touch if these need to be configured differently for your RabbitMQ exchange.
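Because the routing key is the event type, a topic exchange lets you split event families into separate queues. A minimal sketch, assuming a topic exchange named "vital" and placeholder queue names:

```python
import pika

connection = pika.BlockingConnection(
    pika.URLParameters("amqps://<user>:<password>@<broker-host>/<vhost>")
)
channel = connection.channel()

# Assumed names: point your Team ETL Pipeline configuration at this exchange.
channel.exchange_declare(exchange="vital", exchange_type="topic", durable=True)
channel.queue_declare(queue="vital-data-events", durable=True)
channel.queue_declare(queue="vital-other-events", durable=True)

# The routing key is the Vital event type, so topic bindings can route by prefix.
channel.queue_bind(queue="vital-data-events", exchange="vital", routing_key="daily.data.#")
channel.queue_bind(queue="vital-other-events", exchange="vital", routing_key="historical.data.#")
```

Since events are published with the Mandatory flag, make sure your bindings cover every event type enabled for your team; a catch-all binding key of # is a simple way to avoid unroutable events.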

Azure Event Hub

Event Structure

Each event is published with these Azure EventData Properties:

Type | Key | Value
Property | event-type | Vital event type, e.g., daily.data.activity.created
Property | user-id | Vital User ID of the event subject
Property | team-id | Vital Team ID of the event subject
Property | content-encoding | gzip: the blob is compressed by gzip. Absent: the blob is uncompressed.
Property | idempotency-key | The event delivery ID; only unique amongst all events from the same user and of the same event type.
Partition Key | - | Vital User ID
Data | - | The JSON blob. This may or may not be compressed; see content-encoding.

By default, payloads smaller than 1 KiB are not compressed. You must check content-encoding to determine whether the event payload blob needs decompression.

To force Vital to always compress the payload, set compression: always on your Team ETL Pipeline configuration.

At this time, Vital publishes exclusively JSON blobs.

Having said that, you are strongly encouraged to check for and drop any events with an unrecognized content-encoding.
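An illustrative consumer sketch using the azure-eventhub package (connection string, Event Hub name and processing logic are placeholders; depending on the AMQP transport, property keys and values may arrive as bytes, hence the normalization):

```python
import gzip
import json

from azure.eventhub import EventHubConsumerClient  # pip install azure-eventhub


def on_event(partition_context, event):
    # Application properties may be keyed by bytes depending on the transport.
    props = {
        (k.decode() if isinstance(k, (bytes, bytearray)) else k):
        (v.decode() if isinstance(v, (bytes, bytearray)) else v)
        for k, v in (event.properties or {}).items()
    }

    body = event.body
    if not isinstance(body, bytes):
        body = b"".join(body)  # the body may be exposed as an iterable of chunks
    if props.get("content-encoding") == "gzip":
        body = gzip.decompress(body)
    payload = json.loads(body)

    print("received", props.get("event-type"))  # replace with your own handling of `payload`


client = EventHubConsumerClient.from_connection_string(
    "Endpoint=sb://<namespace>.servicebus.windows.net/;"
    "SharedAccessKeyName=<listen-policy>;SharedAccessKey=<key>",
    consumer_group="$Default",
    eventhub_name="<vital-events-hub>",
)
with client:
    client.receive(on_event=on_event)
```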

Broker authentication

Vital connects to your Azure Event Hub namespace through a connection URL that you supply in your Team ETL Pipeline configuration:

  • It must embed an Event Hub Shared Access Signature (SAS); and
  • It must not specify an Event Hub; the default Event Hub should be declared separately as part of your Team ETL Pipeline configuration (see the example below).
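For reference, a conforming connection URL follows the standard namespace-level Event Hubs connection string shape; the angle-bracket values are placeholders for your own namespace and SAS policy:

```python
# Namespace-level connection string with an embedded SAS key. Note there is no
# EntityPath=... component: the default Event Hub is declared separately in the
# Team ETL Pipeline configuration.
ETL_CONNECTION_URL = (
    "Endpoint=sb://<your-namespace>.servicebus.windows.net/;"
    "SharedAccessKeyName=<your-sas-policy>;"
    "SharedAccessKey=<your-sas-key>"
)
```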

Multiple Event Hubs

You can configure your Team ETL Pipeline to route certain events to designated Event Hubs that are not the default Event Hub.

You can do so by declaring a list of Event Hub Matchers (event_hub_matchers) in your configuration. Vital sends each outbound event to the first Event Hub whose matcher has a matching event type prefix, or to the default Event Hub if none matches.
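As a purely hypothetical illustration of the shape (only the event_hub_matchers field name comes from this page; the per-matcher field names below are placeholders), such a configuration fragment might look like:

```python
# Hypothetical configuration fragment; matchers are evaluated in order and the
# first matching event-type prefix wins, otherwise the default Event Hub is used.
etl_pipeline_config = {
    "event_hub": "<default-event-hub>",
    "event_hub_matchers": [
        {"prefix": "daily.data.", "event_hub": "<daily-data-hub>"},       # placeholder field names
        {"prefix": "historical.data.", "event_hub": "<historical-hub>"},  # placeholder field names
    ],
}
```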