Release Notes
August 2024
We now segregate staging schemas by destination to work around the limit on the number of tables allowed in a single schema. We now use fivetran_<group_id>_staging as our staging schema name. Previously, we used _fivetran_staging as the staging schema name.
We recommend that you either grant Fivetran permission to create the schemas or grant the required permissions on the specific schemas yourself. Grant the following permissions (see the sketch after this list):
- CREATE TABLE
- MODIFY
- SELECT
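If you pre-create the staging schema yourself, these grants can be issued as Databricks SQL statements. Below is a minimal sketch using the databricks-sql-connector Python package; the workspace hostname, HTTP path, access token, and grantee principal are placeholders, and the schema name follows the naming convention above.

```python
# Minimal sketch: grant Fivetran the privileges listed above on a
# pre-created staging schema. Requires `pip install databricks-sql-connector`.
# Hostname, HTTP path, token, schema, and principal are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        for privilege in ("CREATE TABLE", "MODIFY", "SELECT"):
            cursor.execute(
                f"GRANT {privilege} ON SCHEMA `fivetran_<group_id>_staging` "
                "TO `<fivetran-principal>`"
            )
```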
June 2024
Databricks on Azure destinations with OAuth authentication don't support Transformations for dbt Core* and Quickstart models.
* dbt Core is a trademark of dbt Labs, Inc. All rights therein are reserved to dbt Labs, Inc. Fivetran Transformations is not a product or service of or endorsed by dbt Labs, Inc.
April 2024
We now support the OAuth machine-to-machine (M2M) authentication type for connecting to Databricks destinations hosted on AWS and Azure. You can use this authentication type only for destinations that are not connected to Fivetran using AWS PrivateLink or Azure Private Link.
To support this enhancement, we have made the following changes:
- Added a new drop-down menu, Authentication Type, to the setup form. The drop-down menu allows you to choose the authentication method you want Fivetran to use.
- Added two new fields, OAuth 2.0 Client ID and OAuth 2.0 Secret, to the destination setup form.
- Added two new parameters, oauth2_client_id and oauth2_secret, to the Fivetran REST API.
For more information, see our setup instructions and REST API documentation. We are gradually rolling out these changes to all existing destinations.
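For illustration, switching an existing destination to OAuth M2M might look like the sketch below, using Fivetran's Modify a Destination endpoint. The oauth2_client_id and oauth2_secret parameters come from this release note; the auth_type field name and its OAUTH2 value are assumptions for illustration, and all credentials are placeholders.

```python
# Hedged sketch: update a Databricks destination to OAuth M2M via the
# Fivetran REST API. oauth2_client_id / oauth2_secret are the parameters
# named in this release note; "auth_type": "OAUTH2" is an assumed field
# name/value for illustration. All IDs and secrets are placeholders.
import requests

response = requests.patch(
    "https://api.fivetran.com/v1/destinations/<destination_id>",
    auth=("<api_key>", "<api_secret>"),  # Fivetran API key/secret (basic auth)
    json={
        "config": {
            "auth_type": "OAUTH2",  # assumed value for the new Authentication Type menu
            "oauth2_client_id": "<oauth-client-id>",
            "oauth2_secret": "<oauth-client-secret>",
        }
    },
    timeout=30,
)
response.raise_for_status()
```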
We now sync data from columns whose names start with a numeric character, as Databricks now supports such column names. Previously, we did not sync any data from these columns. To backfill the historical data for them, re-sync the associated tables. We are gradually rolling out this change to all existing destinations.
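One practical consequence: when you query such columns in Databricks SQL, identifiers that start with a digit must be backtick-quoted. A small illustration, with an invented table and column name:

```python
# Illustration only: digit-leading identifiers must be backtick-quoted in
# Databricks SQL. Table and column names here are invented; connection
# values are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT `2024_revenue` FROM my_schema.orders LIMIT 5")
        print(cursor.fetchall())
```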
October 2023
We no longer force column names to lower case. We now preserve the case of the column names in your destination tables. Your existing queries and scripts will continue to work because Databricks treats column names as case-insensitive.
NOTE: If you previously synced a column name containing upper case letters to a destination table and a column data type later changes in that table, we will rename all of the table's columns to lower case.
March 2023
For the service parameter in the API request for our Databricks destination, you now need to specify databricks as the value. The old values databricks_aws and databricks_azure are still valid.
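For example, a Create a Destination request using the consolidated value might look like the sketch below. The config keys shown are representative; consult the REST API documentation for the authoritative schema, and treat every value here as a placeholder.

```python
# Sketch: create a Databricks destination with service "databricks"
# (replacing the older databricks_aws / databricks_azure values, which
# remain valid). Config keys shown are representative; consult the REST
# API docs for the authoritative schema. All values are placeholders.
import requests

response = requests.post(
    "https://api.fivetran.com/v1/destinations",
    auth=("<api_key>", "<api_secret>"),
    json={
        "group_id": "<group_id>",
        "service": "databricks",  # formerly databricks_aws or databricks_azure
        "region": "<region>",
        "time_zone_offset": "0",
        "config": {
            "server_host_name": "<your-workspace>.cloud.databricks.com",
            "http_path": "/sql/1.0/warehouses/<warehouse-id>",
            "personal_access_token": "<token>",
        },
    },
    timeout=30,
)
response.raise_for_status()
```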
August 2022
Our Databricks destination now supports the Unity Catalog feature. For more information, see our setup instructions.
To upgrade your existing tables and schemas to Unity Catalog, see the Databricks Unity Catalog upgrade documentation.
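As a rough illustration of what the Databricks-side upgrade involves (not Fivetran-specific): external tables can be upgraded in place with SYNC, while managed tables are typically copied into Unity Catalog with CREATE TABLE AS SELECT. Catalog, schema, and table names below are placeholders.

```python
# Hedged illustration of upgrading hive_metastore tables to Unity Catalog.
# SYNC applies to external tables; managed tables are copied with CTAS.
# All catalog/schema/table names and connection values are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        # External table: register the existing data under Unity Catalog.
        cursor.execute(
            "SYNC TABLE main.my_schema.orders "
            "FROM hive_metastore.my_schema.orders"
        )
        # Managed table: copy the data into a Unity Catalog table.
        cursor.execute(
            "CREATE TABLE main.my_schema.events AS "
            "SELECT * FROM hive_metastore.my_schema.events"
        )
```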
March 2022
We now use Azure Blob Storage as the staging location for our temporary files if:
- Your Databricks cluster is hosted on Azure
- Your Databricks cluster is a SQL Endpoint cluster
- Your Databricks cluster is a general-purpose cluster with Databricks Runtime version 10.2 or above
November 2021
Our Databricks destination now supports clusters with Databricks Runtime versions 9.0 - 10.x.
June 2021
Our Databricks destination now supports Databricks SQL endpoint connections. For more information, see our setup instructions.
We now support Databricks on Google Cloud.
April 2021
Our Databricks destination now supports clusters with Databricks Runtime 8.0 and above.
March 2021
Our Databricks destination is now generally available.
Read our Databricks destination documentation.
December 2020
We have reduced the sync duration of append-only tables that have primary key columns of INT, SMALLINT, or BIGINT data type. During internal performance testing, we observed a reduction in sync time for destinations with large table sizes.
dbt Transformations now support the Databricks destination. If you use Databricks as a destination, set up your dbt Transformations using our guide; no additional steps are required.
August 2020
Our Databricks REST API endpoint now supports the creation of external tables. You can now opt to create Delta tables as external tables for your Databricks implementations.
Our Databricks destination now supports the creation of external tables. You can now opt to create Delta tables as external tables from the connector setup form.
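For context, an external Delta table differs from a managed one in that its data lives at an explicit storage location you control rather than in the metastore-managed path. A hedged illustration of the Databricks-side equivalent, with invented names and bucket URI:

```python
# Illustration only: an external Delta table pins its data to an explicit
# LOCATION. Schema, table name, and S3 URI are invented; connection values
# are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(
            "CREATE TABLE IF NOT EXISTS my_schema.orders "
            "(id BIGINT, amount DOUBLE) "
            "USING DELTA LOCATION 's3://my-bucket/delta/orders'"
        )
```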
We now support syncing the BINARY data type from your source.
July 2020
We now use our own Amazon S3 bucket as intermediate storage for staging temporary data during a sync. Now, when setting up Databricks as your destination, you do not have to create an S3 bucket.
We will end support for clusters with Databricks Runtime 7.0 and below on August 15, 2020. To prevent your integrations from failing or causing data loss, upgrade your Databricks Runtime to 7.1 or above before August 15, 2020.
June 2020
We now replicate empty tables in a PostgreSQL source database as empty tables in the destination.
February 2020
We have added Databricks as one of our supported destinations. You can now use Databricks as your destination with Fivetran connectors.