# Amplitude Modeling dbt Package (Docs)

## What does this dbt package do?
Produces modeled tables that leverage Amplitude data from Fivetran's connector in the format described by this ERD and builds off the output of our Amplitude source package.
Enables users to do the following:
- Leverage event data that is enhanced with additional event type and pivoted custom property fields for later downstream use
- View aggregated metrics for each unique session
- View aggregated metrics for each unique user
- View daily performance metrics for each event type
- Use the enhanced event data to leverage dbt metrics to generate additional analytics
- Incorporate the dbt Product Analytics package to further enhance Amplitude data, like for funnel and retention analysis
This package also generates a comprehensive data dictionary of your source and modeled Amplitude data through the dbt docs site. You can also refer to the table below for a detailed view of all tables materialized within this package by default.
| Table | Description |
| --- | --- |
| `amplitude__event_enhanced` | Each record represents event data, enhanced with event type data and unnested event, group, and user properties. |
| `amplitude__sessions` | Each record represents a distinct session with aggregated metrics for that session. |
| `amplitude__user_enhanced` | Each record represents a distinct user with aggregated metrics for that user. |
| `amplitude__daily_performance` | Each record represents performance metrics for each distinct day and event type. |
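Once built, these models can be queried like any other tables in your destination. Below is a minimal sketch of a query against `amplitude__sessions`; the column names used here (`session_started_at`, `session_length_in_minutes`) are assumptions for illustration, so confirm the exact schema in the dbt docs generated by this package.

```sql
-- Hypothetical example: daily session counts and average session length.
-- session_started_at and session_length_in_minutes are assumed column names;
-- verify them against the package's generated documentation.
select
    cast(session_started_at as date) as session_date,
    count(*) as total_sessions,
    avg(session_length_in_minutes) as avg_session_minutes
from analytics.amplitude__sessions -- replace with your destination schema
group by 1
order by 1
```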
## How do I use the dbt package?

### Step 1: Prerequisites
To use this dbt package, you must have the following:
- At least one Fivetran Amplitude connector syncing data into your destination.
- A BigQuery, Snowflake, Redshift, PostgreSQL, or Databricks destination.
#### Databricks dispatch configuration

If you are using a Databricks destination with this package, you must add the following (or a variation of the following) dispatch configuration within your `dbt_project.yml` file. This is required in order for the package to accurately search for macros within the `dbt-labs/spark_utils` and then the `dbt-labs/dbt_utils` packages, in that order.
```yml
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']
```
#### Database Incremental Strategies

This package's incremental models are configured to leverage the different incremental strategies available in each supported warehouse:

- For BigQuery and Databricks All Purpose Cluster runtime destinations, we have chosen `insert_overwrite` as the default strategy, which benefits from the partitioning capability.
- For Databricks SQL Warehouse destinations, models are materialized as tables without support for incremental runs.
- For Snowflake, Redshift, and Postgres databases, we have chosen `delete+insert` as the default strategy.

Regardless of strategy, we recommend that users periodically run a `--full-refresh` to ensure a high level of data quality.
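If you need to override a default for a particular model, dbt lets you set the strategy in a model's config block. Below is a minimal sketch for a BigQuery destination; the model name, partition column, and downstream select are hypothetical and not part of this package:

```sql
-- models/my_custom_model.sql (hypothetical)
{{
    config(
        materialized='incremental',
        incremental_strategy='insert_overwrite',
        partition_by={'field': 'event_day', 'data_type': 'date'} -- BigQuery partition config
    )
}}

select *
from {{ ref('amplitude__event_enhanced') }}
```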
### Step 2: Install the package

Include the following Amplitude package version in your `packages.yml` file:
> TIP: Check dbt Hub for the latest installation instructions, or read the dbt docs for more information on installing packages.
```yml
packages:
  - package: fivetran/amplitude
    version: [">=0.5.0", "<0.6.0"] # we recommend using ranges to capture non-breaking changes automatically
```
Do NOT include the `amplitude_source` package in this file. The transformation package itself has a dependency on it and will install the source package as well.
### Step 3: Define database and schema variables

By default, this package will run using your target database and the `amplitude` schema. If this is not where your Amplitude data is, add the following configuration to your root `dbt_project.yml` file:
```yml
# dbt_project.yml

...
config-version: 2

vars:
    amplitude_database: your_database_name
    amplitude_schema: your_schema_name
```
### Step 4: Configure event date range

Because of the typical volume of event data, you may want to limit this package's models to a recent date range. However, note that the `amplitude__daily_performance`, `amplitude__event_enhanced`, and `amplitude__sessions` final models are materialized as incremental tables.

The default date range for the `stg_amplitude__event` model starts at `2020-01-01` and ends one month past the current day, while the default date range for this package's date spine model starts at `2020-01-01` and ends one day after the current day. To customize the date range, add the following configurations to your root `dbt_project.yml` file:
```yml
# dbt_project.yml

...
vars:
    amplitude__date_range_start: '2022-01-01' # your start date here
    amplitude__date_range_end: '2022-12-01' # your end date here
```
If you adjust the date range variables, we recommend running `dbt run --full-refresh` to ensure there are no data quality issues within the adjusted date range.
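For example, a full refresh scoped to just this package's models could look like the following (`package:` is dbt's standard selector method):

```bash
# Rebuild this package's incremental models from scratch after changing the date range
dbt run --full-refresh --select package:amplitude
```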
### (Optional) Step 5: Additional configurations
#### Lookback Window
Records from the source can sometimes arrive late. Since several of the models in this package are incremental, by default we look back 3 days when processing new records to ensure late arrivals are captured while avoiding the need for frequent full refreshes. While the frequency can be reduced, we still recommend running `dbt run --full-refresh` periodically to maintain the data quality of the models.
To change the default lookback window, add the following variable to your `dbt_project.yml` file:
```yml
vars:
    amplitude:
        lookback_window: number_of_days # default is 3
```
#### Change source table references
If an individual source table has a different name than the package expects, add the table name as it appears in your destination to the respective variable:
> IMPORTANT: See the package's source `dbt_project.yml` variable declarations to see the expected names.
```yml
# dbt_project.yml

...
config-version: 2

vars:
    <package_name>__<default_source_table_name>_identifier: your_table_name
```
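For instance, applying the pattern above to the Amplitude `event` table, the override might look like the sketch below. The variable name is derived from the template in this section; confirm the exact name in the source package's `dbt_project.yml` before using it.

```yml
vars:
    amplitude__event_identifier: my_custom_event_table # hypothetical table name in your destination
```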
#### Change build schema

By default, this package builds the Amplitude staging models within a schema titled (`<target_schema>` + `_stg_amplitude`) in your target database. If this is not where you would like your Amplitude staging data to be written, add the following configuration to your root `dbt_project.yml` file:
```yml
# dbt_project.yml

models:
    amplitude_source:
        +schema: my_new_schema_name # leave blank for just the target_schema
```
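For context, dbt's default schema name generation concatenates your target schema with the custom schema. For example, with a target schema of `analytics` and the configuration above, the staging models would build into `analytics_my_new_schema_name` (unless your project overrides the `generate_schema_name` macro).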
#### Pivot out nested fields containing custom properties

The Amplitude schema allows for custom properties to be passed as nested fields (for example, `user_properties: {"Cohort":"Test A"}`). To pivot out the properties, add the following configurations to your root `dbt_project.yml` file:
```yml
# dbt_project.yml

...
vars:
    event_properties_to_pivot: ['event_property_1','event_property_2']
    group_properties_to_pivot: ['group_property_1','group_property_2']
    user_properties_to_pivot: ['user_property_1','user_property_2']
```
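Using the nested `user_properties` example above, a minimal sketch of the configuration would be:

```yml
vars:
    user_properties_to_pivot: ['Cohort'] # pivots user_properties.Cohort into its own column
```

The pivoted value then surfaces as its own column in `amplitude__event_enhanced`; verify the resulting column name in the documentation generated for your project.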
### (Optional) Step 6: Using this package with the dbt Product Analytics package
The dbt_product_analytics package contains macros that allow for further exploration, such as event flow, funnel, and retention analysis. To leverage this in conjunction with this package, add the following configuration to your project's `packages.yml` file:
```yml
packages:
  - package: mjirv/dbt_product_analytics
    version: [">=0.1.0"]
```
Refer to the dbt_product_analytics usage instructions and the example below:
```sql
-- # product_analytics_funnel.sql
{% set events =
    dbt_product_analytics.event_stream(
        from=ref('amplitude__event_enhanced'),
        event_type_col="event_type",
        user_id_col="amplitude_user_id",
        date_col="event_day",
        start_date="your_start_date",
        end_date="your_end_date")
%}

{% set steps = ["event_type_1", "event_type_2", "event_type_3"] %}

{{ dbt_product_analytics.funnel(steps=steps, event_stream=events) }}
```
### (Optional) Step 7: Orchestrate your models with Fivetran Transformations for dbt Core™
Fivetran offers the ability for you to orchestrate your dbt project through the Fivetran Transformations for dbt Core™ product. Refer to the linked docs for more information on how to set up your project for orchestration through Fivetran.
## Does this package have dependencies?
This dbt package is dependent on the following dbt packages. These dependencies are installed by default within this package. For more information on the following packages, refer to the dbt Hub site.
> IMPORTANT: If you have any of these dependent packages in your own `packages.yml` file, we highly recommend that you remove them from your root `packages.yml` to avoid package version conflicts.
```yml
packages:
  - package: fivetran/amplitude_source
    version: [">=0.3.0", "<0.4.0"]

  - package: fivetran/fivetran_utils
    version: [">=0.4.0", "<0.5.0"]

  - package: dbt-labs/dbt_utils
    version: [">=1.0.0", "<2.0.0"]

  - package: dbt-labs/spark_utils
    version: [">=0.3.0", "<0.4.0"]
```
## How is this package maintained and can I contribute?

### Package Maintenance
The Fivetran team maintaining this package only maintains the latest version of the package. We highly recommend you stay consistent with the latest version of the package and refer to the CHANGELOG and release notes for more information on changes across versions.
### Opinionated Decisions
In creating this package, which is meant for a wide range of use cases, we had to take opinionated stances on a few different questions we came across during development. We've consolidated significant choices we made in the DECISIONLOG.md, and will continue to update as the package evolves. We are always open to and encourage feedback on these choices, and the package in general.
### Contributions
These dbt packages are developed by a small team of analytics engineers at Fivetran. However, the packages are made better by community contributions.
We highly encourage and welcome contributions to this package. Check out this post on the best workflow for contributing to a package.
## Are there any resources available?
- If you encounter any questions or want to reach out for help, see the GitHub Issue section to find the right avenue of support for you.
- If you would like to provide feedback to the dbt package team at Fivetran, or would like to request a future dbt package to be developed, then feel free to fill out our Feedback Form.