Azure Data Lake Storage
Azure Data Lake Storage (ADLS) is a cloud-based, scalable data storage solution for big data analytics. ADLS allows you to store and manage massive amounts of data in any format. Fivetran supports data lakes built on ADLS as a destination.
Setup guide
Follow our step-by-step Azure Data Lake Storage setup guide to connect your Azure Data Lake Storage destination with Fivetran.
Type transformation and mapping
The data types in your Azure Data Lake Storage destination follow Fivetran's standard data type storage.
We use the following data type conversions:
| Fivetran Data Type | Destination Data Type |
|---|---|
| BOOLEAN | BOOLEAN |
| SHORT | SHORT |
| INT | INTEGER |
| LONG | LONG |
| BIGDECIMAL | DECIMAL(38, 10) |
| FLOAT | FLOAT |
| DOUBLE | DOUBLE |
| LOCALDATE | DATE |
| INSTANT | TIMESTAMP |
| STRING | STRING |
| XML | STRING |
| JSON | STRING |
| BINARY | BINARY |
NOTE: Fivetran stores hex-encoded BINARY values in your destination. You can use the unhex function, decode(unhex(col_name), 'UTF-8'), in your queries to fetch the decoded BINARY values.
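For example, a Spark SQL query that reads a decoded BINARY value might look like this (a sketch; the table name my_schema.my_table and the column raw_payload are hypothetical, so substitute your own):

```sql
-- Hypothetical table and column names; replace with your own.
-- unhex() converts the stored hex string back to binary bytes,
-- and decode() interprets those bytes as UTF-8 text.
SELECT
  id,
  decode(unhex(raw_payload), 'UTF-8') AS payload_text
FROM my_schema.my_table;
```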
Supported query engines
To extract data from your ADLS destination, use the following query engines:
- Azure Synapse Analytics
- Databricks (through Unity Catalog external tables)
Data format
Fivetran stores your data in a structured format in the destination. We write your source data to Parquet files in the Fivetran pipeline and use Delta Lake format to store these files in the data lake.
Folder structure
We can sync your data to any destination folder of your choice. If you specify a prefix path, we write your data to the following directory: <root>/<prefix_path>/<schema>/<table>. If you do not specify a prefix path, we create a folder and set the prefix path to fivetran by default. We then write your data to the following directory: <root>/fivetran/<schema>/<table>.
Unity Catalog
You can create external tables in Databricks Unity Catalog for your ADLS data and query your data from these external tables. For more information about integrating Unity Catalog with your ADLS destination, see our Unity Catalog Setup Guide.
NOTE: Databricks uses the table definition to understand the structure of the data. It stores the metadata of the tables in the metastore and allows us to interact with them like regular tables within Databricks by accessing the data in its original location.
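As a sketch of what such an external table definition can look like (the catalog, schema, table, and ADLS path below are hypothetical; follow the Unity Catalog Setup Guide for the exact steps):

```sql
-- Hypothetical names and location; replace with your own catalog,
-- schema, table, and ADLS container path.
-- For Delta tables, no column list is needed: Databricks reads
-- the schema from the Delta Lake metadata at the given location.
CREATE TABLE my_catalog.my_schema.my_table
USING DELTA
LOCATION 'abfss://mycontainer@myaccount.dfs.core.windows.net/fivetran/my_schema/my_table';
```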
Table maintenance operations
We perform the following maintenance operations on the Delta Lake tables in your destination:
- Delete old snapshots: We delete the table snapshots that are older than the Snapshot Retention Period you specify in the destination setup form. However, we always retain the last 4 checkpoints of a table before deleting its snapshots.
- Delete orphan and removed files: Orphan files are created by unsuccessful operations within your data pipeline. They are stored in your ADLS container but are no longer referenced in the Delta Lake table metadata. Removed files are files that are not referenced in the latest table snapshot but were referenced in older snapshots. Both orphan and removed files contribute to your ADLS subscription costs. We identify such files that are older than 7 days and delete them at regular intervals of 2 weeks to maintain an efficient data storage environment.
NOTE: You may observe a sync delay for your connectors while the table maintenance operations are in progress. To ensure a seamless experience with minimal sync delays, we perform the table maintenance operations only on Saturdays.
Limitations
- Fivetran does not support history mode for the ADLS destination.
- Fivetran creates DECIMAL columns with the maximum precision and scale (38, 10).
- Spark SQL pool queries cannot read the maximum values of the DOUBLE and FLOAT data types.
- Spark SQL pool queries truncate TIMESTAMP values to seconds. To query a table using a TIMESTAMP column, you can use the from_unixtime(unix_timestamp(<col_name>, 'yyyy-MM-dd HH:mm:ss.SSS'), 'yyyy-MM-dd HH:mm:ss.ms') clause in your queries to get the accurate values, including milliseconds and microseconds.
- Fivetran does not support the Change Data Feed feature for Delta Lake tables. You must not enable Change Data Feed for the Delta Lake tables that Fivetran creates in your ADLS destination.
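For example, a query against a hypothetical events table with a created_at TIMESTAMP column could apply this clause as follows (a sketch; table and column names are assumptions):

```sql
-- Hypothetical table and column names; replace with your own.
-- Reformats the TIMESTAMP value so that sub-second precision
-- is preserved in the query result instead of being truncated.
SELECT
  from_unixtime(unix_timestamp(created_at, 'yyyy-MM-dd HH:mm:ss.SSS'),
                'yyyy-MM-dd HH:mm:ss.ms') AS created_at_precise
FROM my_schema.events;
```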
Unity Catalog limitations
- To query any table using an XML column, you can use a regular expression.
- To filter TIMESTAMP values, you can use the string(<col_name>) clause in your queries to get the accurate values.
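For example, a filter on a hypothetical created_at column could be written as follows (a sketch; table and column names are assumptions):

```sql
-- Hypothetical table and column names; replace with your own.
-- Casting the TIMESTAMP to a string preserves the full value,
-- including fractional seconds, for the comparison.
SELECT *
FROM my_schema.events
WHERE string(created_at) = '2024-01-15 10:30:00.123';
```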