Unity Catalog with Hive#
The Hive connector supports the INSERT write operation and reading from external tables when using the Databricks Unity Catalog as a metastore on AWS, Azure, or GCP.
The file formats used by Databricks tables (CSV, JSON, AVRO, PARQUET, ORC, and TEXT) are mapped to the Hive table format with the corresponding file format in Starburst.
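As an illustration of the mapping described above, the correspondence can be sketched as a simple lookup from the Databricks file format to a Hive storage format of the same name. This is a sketch only, not part of any SEP API; the `TEXT` entry assumes Databricks `TEXT` corresponds to Hive's `TEXTFILE` format.

```python
# Illustrative sketch only: the Databricks file formats listed above map
# onto Hive storage formats with corresponding names in Starburst.
DATABRICKS_TO_HIVE_FORMAT = {
    "CSV": "CSV",
    "JSON": "JSON",
    "AVRO": "AVRO",
    "PARQUET": "PARQUET",
    "ORC": "ORC",
    # Assumption: Databricks TEXT corresponds to Hive's TEXTFILE format.
    "TEXT": "TEXTFILE",
}

def hive_format_for(databricks_format: str) -> str:
    """Return the Hive storage format for a Databricks table file format."""
    return DATABRICKS_TO_HIVE_FORMAT[databricks_format.upper()]
```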
Configuration#
To use the Unity Catalog as a metastore, add the following configuration properties to your catalog configuration file:
hive.metastore=unity
hive.security=read_only
hive.metastore.unity.host=host
hive.metastore.unity.token=token
hive.metastore.unity.catalog-name=main
The following table shows the configuration properties used to connect SEP to Unity Catalog as a metastore.
Property name | Description
---|---
`hive.metastore.unity.host` | Name of the host without the `http(s)` prefix.
`hive.metastore.unity.token` | The personal access token used to authenticate a connection to the Unity Catalog metastore. For more information about generating access tokens, see the Databricks documentation.
`hive.metastore.unity.catalog-name` | Name of the catalog in Databricks.
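The catalog configuration above can also be assembled programmatically, for example when templating catalog files across environments. The following sketch is a hypothetical helper, not part of SEP; the host and token arguments shown in the usage line are placeholders. Only the property names themselves come from the documentation above.

```python
# Sketch of assembling the catalog configuration shown above.
# The helper name and the placeholder values are illustrative, not part of SEP.
def unity_catalog_properties(host: str, token: str, catalog_name: str = "main") -> str:
    """Render catalog properties for a Hive catalog backed by Unity Catalog."""
    properties = {
        "hive.metastore": "unity",
        "hive.security": "read_only",
        "hive.metastore.unity.host": host,
        "hive.metastore.unity.token": token,
        "hive.metastore.unity.catalog-name": catalog_name,
    }
    return "\n".join(f"{key}={value}" for key, value in properties.items())

# Placeholder host and token; substitute your workspace host and access token.
print(unity_catalog_properties("<host>", "<personal-access-token>"))
```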