Snowflake catalogs #
You can use a Snowflake catalog to configure access to Snowflake in supported deployments.
Follow these steps to begin creating a catalog for Snowflake:
- In the navigation menu, select Catalogs.
- Click Create catalog.
- On the Select a data source pane, click the Snowflake icon.
- Follow the instructions in the next sections to configure your Snowflake connection.
Select a cloud provider #
The Cloud provider configuration allows Starburst Galaxy to correctly match catalogs and clusters. The data source configured in a catalog and the cluster must operate in the same cloud provider and region for performance and cost reasons.
If you select the Google Cloud icon, a pop-up note reminds you that with Google Cloud, Snowflake can only be used as a read-only catalog. No write operations can be performed.
Define catalog name and description #
The Name of the catalog identifies it when writing SQL and when browsing its nested schemas and tables in client applications. The name is displayed in the Query editor and in the output of a SHOW CATALOGS command, and it is used to fully qualify the name of any table in SQL queries following the catalogname.schemaname.tablename syntax. For example, you can run the following query in the sample cluster without first setting the catalog and schema context:
SELECT * FROM tpch.sf1.nation;
The Description is a short, optional paragraph that provides further details about the catalog. It appears in the Starburst Galaxy user interface and can help other users determine what data can be accessed with the catalog.
Configure the connection #
The connection to the database requires username and password authentication, as well as details to connect to the database server, typically a hostname or IP address and a port. The following sections detail the setup for the supported cloud providers.
Snowflake configuration #
The user credentials must belong to a user that has been granted permissions to use the Snowflake role with the required access rights.
To configure the connection to your Snowflake cluster, you must provide the following details:
- Snowflake account identifier: specify your Snowflake account ID in a URL-friendly lowercase form. The identifier can take different forms, depending on the specifics of your Snowflake account. The most common forms are <account_name> and <account_name>.<cloud_region_id>.<cloud>, where the additional <cloud_region_id>.<cloud> segments identify the cloud platform and region where your account is hosted. See account identifiers to learn more about Snowflake identifier formats.
- Username: enter a Snowflake username.
- Password: specify the password for this username.
- Database name: specify the database you want to connect to.
- Warehouse name: specify the warehouse you want to connect to.
- Snowflake role: select a role with the required access rights.
The Enable parallel mode toggle is disabled by default. Leave it disabled to continue using single-threaded query mode, which conserves resources, or click the toggle to enable parallel mode.
With parallel mode enabled, this catalog takes advantage of a feature of the Snowflake JDBC driver to fetch query results over multiple parallel connections. For most queries, this translates to faster performance.
Test the connection #
Once you have configured the connection details, click Test connection to confirm data access is working. If the test is successful, you can save the catalog.
If the test fails, look over your entries in the configuration fields, correct any errors, and try again. If the test continues to fail, Galaxy provides diagnostic information that you can use to fix the data source configuration in the cloud provider system.
Connect catalog #
Click Connect catalog, and proceed to set permissions where you can grant access to certain roles.
Set permissions #
This optional step allows you to configure read-only access or full read and write access to the catalog.
Setting read-only permissions grants the specified roles read-only access to the catalog. As a result, users have read-only access to all contained schemas, tables, and views.
Setting read/write permissions grants the specified roles full read and write access to the catalog. As a result, users have full read and write access to all contained schemas, tables, and views.
Use the following steps to assign access to roles:
- In the Role-level permissions section, expand the menu in the Roles with read and write access field.
- From the list, select one or more roles to grant read and write access to.
- Expand the menu in the Roles with read access field.
- Select one or more roles from the list to grant read access to.
- Click Save access controls.
Add to cluster #
You can add your catalog to a cluster later by editing a cluster. Click Skip to proceed to the catalogs page.
Use the following steps to add your catalog to an existing cluster or create a new cluster in the same cloud region:
- In the Add to cluster section, expand the menu in the Select cluster field.
- Select one or more existing clusters from the drop-down menu.
- Click Create a new cluster to create a new cluster in the same region, and add it to the cluster selection menu.
- Click Add to cluster to view your new catalog’s configuration.
The Pending changes to clusters dialog appears when you try to add a catalog to a running cluster.
- In the Pending changes to cluster dialog, click Return to catalogs to edit the catalog or create a new catalog.
- Click Go to clusters to confirm the addition of the catalog to the running cluster.
- On the Clusters page, click the Update icon beside the running cluster to add the catalog.
SQL support #
The catalog provides read access and write access to data and metadata in the Snowflake database. It supports the following features:
- Globally available statements
- Read operations
- Write operations
The following section provides details for the SQL support with Snowflake catalogs.
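As an illustration, the following statements show each category of supported operation. The catalog name snowflake and the sales.orders table with its columns are hypothetical examples:

```sql
-- Globally available statement: list the schemas in the catalog
SHOW SCHEMAS FROM snowflake;

-- Read operation: query a table
SELECT * FROM snowflake.sales.orders LIMIT 10;

-- Write operation: add a row
INSERT INTO snowflake.sales.orders (order_id, status)
VALUES (1001, 'NEW');
```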
Data management details #
If a WHERE clause is specified, the DELETE operation only works if the predicate in the clause can be fully pushed down to the data source.
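For example, a DELETE with a simple comparison predicate can typically be pushed down, while a predicate that must be evaluated by the query engine may cause the statement to fail. The catalog and table names below are hypothetical:

```sql
-- Works: the predicate can be fully pushed down to Snowflake
DELETE FROM snowflake.sales.orders
WHERE status = 'CANCELLED';

-- May fail: a predicate that requires engine-side evaluation
-- cannot be fully pushed down
DELETE FROM snowflake.sales.orders
WHERE regexp_like(status, '^CANCEL');
```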