# Starburst SAP HANA connector

The SAP HANA connector allows querying and creating tables in an external SAP HANA database. Connectors let Starburst Enterprise platform (SEP) join data provided by different databases, like SAP HANA and Hive, or different database instances.
## Requirements

To connect to SAP HANA, you need:

- SAP HANA version 2.0 or higher.
- Network access from the coordinator and workers to the SAP HANA server. Port 30015 is the default port for instance 00.
- A valid Starburst Enterprise license.
## Configuration

To install the SAP HANA JDBC driver:

1. Get the JDBC driver for SAP HANA from SAP.
2. Add the JDBC driver JAR file to the SEP `plugin/sap-hana` directory on all nodes.
3. Restart SEP on every node.

To configure the SAP HANA connector as the `myhanadb` catalog, create a file named `myhanadb.properties` in `etc/catalog`:

```properties
connector.name=sap-hana
connection-url=jdbc:sap://Hostname:Port/?optionalparameters
connection-user=USERNAME
connection-password=PASSWORD
```

Refer to the SAP HANA documentation for more information about the format and parameters of the JDBC URL supported by the SAP HANA JDBC driver.
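With the catalog in place, SAP HANA data is queryable like any other catalog. A quick check, where `my_schema` and `my_table` are placeholder names for a schema and table in your SAP HANA instance:

```sql
SHOW SCHEMAS FROM myhanadb;
SHOW TABLES FROM myhanadb.my_schema;
SELECT * FROM myhanadb.my_schema.my_table LIMIT 10;
```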
## Type mapping
Because SEP and SAP HANA each support types that the other does not, this connector modifies some types when reading or writing data.
### SAP HANA to SEP read type mapping

The following read type mapping applies when data is read from existing tables in SAP HANA, or inserted into existing tables in SAP HANA from SEP.

| SAP HANA database type | SEP type | Notes |
| --- | --- | --- |
| BOOLEAN | BOOLEAN | |
| TINYINT | TINYINT | |
| SMALLINT | SMALLINT | |
| INTEGER | INTEGER | |
| BIGINT | BIGINT | |
| REAL | REAL | |
| DOUBLE | DOUBLE | |
| FLOAT(p) | REAL for p <= 24, DOUBLE otherwise | |
| DECIMAL(p, s) | DECIMAL(p, s) | |
| DECIMAL | DOUBLE | SAP HANA's DECIMAL with precision and scale not specified represents a floating-point decimal number |
| SMALLDECIMAL | DOUBLE | SAP HANA's SMALLDECIMAL represents a floating-point decimal number |
| NCHAR | CHAR | |
| VARCHAR(n) | VARCHAR(n) | |
| NVARCHAR(n) | VARCHAR(n) | |
| ALPHANUM(n) | VARCHAR(n) | |
| SHORTTEXT(n) | VARCHAR(n) | |
| CLOB | VARCHAR (unbounded) | |
| NCLOB | VARCHAR (unbounded) | |
| TEXT | VARCHAR (unbounded) | |
| BINTEXT | VARCHAR (unbounded) | |
| VARBINARY(n) | VARBINARY | |
| BLOB | VARBINARY | |
| DATE | DATE | |
| TIME | TIME(0) | |
| SECONDDATE | TIMESTAMP(0) | |
| TIMESTAMP | TIMESTAMP(7) | |

All other types aren't supported.
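You can verify how a SAP HANA table surfaces in SEP by inspecting its schema; the type column of `DESCRIBE` follows the mapping above. The catalog, schema, and table names here are placeholders:

```sql
DESCRIBE myhanadb.my_schema.my_table;
```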
### SEP to SAP HANA write type mapping

The following write type mapping applies when tables are created in SAP HANA from SEP.

| SEP type | SAP HANA database type | Notes |
| --- | --- | --- |
| BOOLEAN | BOOLEAN | |
| TINYINT | TINYINT | |
| SMALLINT | SMALLINT | |
| INTEGER | INTEGER | |
| BIGINT | BIGINT | |
| REAL | REAL | |
| DOUBLE | DOUBLE | |
| DECIMAL(p, s) | DECIMAL(p, s) | |
| CHAR | CHAR or NCLOB | |
| VARCHAR | NVARCHAR or CLOB | |
| VARBINARY | BLOB | |
| DATE | DATE | |
| TIME(p) | TIME | |
| TIMESTAMP(p) | SECONDDATE for p = 0, TIMESTAMP otherwise | |

All other types aren't supported.
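As an illustration of the write mapping, creating a table from SEP with timestamp columns of different precisions produces different SAP HANA types; the schema and table names are placeholders:

```sql
CREATE TABLE myhanadb.my_schema.events (
  id BIGINT,            -- BIGINT in SAP HANA
  created TIMESTAMP(0), -- SECONDDATE in SAP HANA (p = 0)
  updated TIMESTAMP(6)  -- TIMESTAMP in SAP HANA (p > 0)
);
```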
## General configuration properties

The following properties can be used to configure how data types from the connected data source are mapped to Trino data types and how the metadata is cached in Trino.

| Property name | Description | Default value |
| --- | --- | --- |
| `unsupported-type-handling` | Configure how unsupported column data types are handled: `IGNORE`, the column is not accessible; `CONVERT_TO_VARCHAR`, the column is converted to unbounded `VARCHAR`. The respective catalog session property is `unsupported_type_handling`. | `IGNORE` |
| `jdbc-types-mapped-to-varchar` | Allow forced mapping of comma separated lists of data types to convert to unbounded `VARCHAR` | |
| `case-insensitive-name-matching` | Support case insensitive database and collection names | `false` |
| `case-insensitive-name-matching.cache-ttl` | Duration for which case insensitive schema and table names are cached | 1 minute |
| `metadata.cache-ttl` | Duration for which metadata, including table and column statistics, is cached | 0 (caching disabled) |
| `metadata.cache-missing` | Cache the fact that metadata, including table and column statistics, is not available | `false` |
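For example, a catalog that converts unsupported SAP HANA types to `VARCHAR` and caches metadata for ten minutes could add the following to the catalog properties file; the values are illustrative:

```properties
unsupported-type-handling=CONVERT_TO_VARCHAR
metadata.cache-ttl=10m
metadata.cache-missing=true
```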
## SQL support
The connector provides read and write access to data and metadata in SAP HANA. In addition to the globally available and read operation statements, the connector supports the following features:
### Views
The connector can read data from views, including SAP HANA calculation views.
## Performance
The connector includes a number of performance improvements, detailed in the following sections.
### Table statistics

The SAP HANA connector supports table and column statistics to improve query processing performance based on the actual data in the data source.

The statistics are collected by SAP HANA and retrieved by the connector. You have to use the `CREATE STATISTICS` command in SAP HANA to initiate creation and ongoing collection and update of the relevant statistics. You can find more information about statistics collection in the SAP HANA documentation.

The connector and SEP support the statistic types `HISTOGRAM`, `SIMPLE`, and `TOPK`.

**Note:** The collection in SAP HANA can take considerable time and depends on the data size. You can use the `MERGE DELTA` command to affect availability of the statistics.
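For orientation, statistics are created on the SAP HANA side and are then visible from SEP. The statements below are a sketch; the statistics, schema, table, and column names are placeholders, and the exact `CREATE STATISTICS` syntax is described in the SAP HANA documentation:

```sql
-- In SAP HANA: create a histogram data statistics object (illustrative names)
CREATE STATISTICS my_stats ON my_schema.my_table (my_column) TYPE HISTOGRAM;

-- In SEP: inspect the statistics retrieved by the connector
SHOW STATS FOR myhanadb.my_schema.my_table;
```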
### Pushdown

The connector supports pushdown for a number of operations:

Aggregate pushdown for the following functions:

Additionally, for the aggregate functions below, pushdown is only supported for `DOUBLE` type columns:
### Dynamic filtering

Dynamic filtering is enabled by default. It causes the connector to wait for dynamic filtering to complete before starting a JDBC query.

You can disable dynamic filtering by setting the property `dynamic-filtering.enabled` in your catalog properties file to `false`.
### JDBC connection pooling
You can improve performance by enabling JDBC connection pooling, which is disabled by default.
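Enabling pooling is a catalog-level setting. The property names below follow the connection pooling configuration used by SEP's JDBC-based connectors and are an assumption here; verify them against the documentation for your SEP version:

```properties
connection-pool.enabled=true
connection-pool.max-size=10
```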
## Security
The connector includes a number of security-related features, detailed in the following sections.
### Password credential pass-through

The connector supports password credential pass-through. To enable it, edit the catalog properties file to include the authentication type:

```properties
sap-hana.authentication.type=PASSWORD_PASS_THROUGH
```

For more information about configurations and limitations, see Password credential pass-through.