All Apps and Add-ons

Cloud Snowflake DB Connect Integration

ekcsoc
Path Finder

Hi,

Could you please help here? I am trying to integrate Snowflake with DB Connect.

I am using the following JRE: /usr/java/jdk1.8.0_131/jre

I have placed the Snowflake driver at the following locations:

/opt/splunk/etc/apps/splunk_app_db_connect/drivers/snowflake-jdbc-3.9.1-javadoc.jar
/opt/splunk/etc/apps/splunk_app_db_connect/drivers/snowflake-libs/snowflake-jdbc-3.9.1-javadoc.jar
/opt/splunk/etc/apps/splunk_app_db_connect/jars/snowflake-jdbc-3.9.1-javadoc.jar

I have put the below stanza in /opt/splunk/etc/apps/splunk_app_db_connect/local/db_connection_types.conf:

[snowflake]
displayName = Snowflake
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcUrlFormat = jdbc:snowflake://:/
jdbcDriverClass = com.snowflake.client.jdbc.SnowflakeDriver

Now when I create a connection and try to save it, I get the following error:

Database connection snowflake is invalid
No suitable driver found for jdbc:snowflake://"":443/

1 Solution

ptang_splunk
Splunk Employee

Hi there,

Please try the following steps:

  1. Make sure you download the latest Snowflake JDBC Driver jar (and NOT the javadoc.jar): https://docs.snowflake.net/manuals/user-guide/jdbc-download.html#downloading-the-driver OR https://search.maven.org/classic/#search%7Cga%7C1%7Csnowflake-jdbc
  2. Drop the .jar file (driver) under $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers
  3. Create or update db_connection_types.conf under $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local with the following:
     [snowflake]
     displayName = Snowflake
     serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
     jdbcDriverClass = net.snowflake.client.jdbc.SnowflakeDriver
     jdbcUrlFormat = jdbc:snowflake://<host>:<port>/?db=<database>
     ui_default_catalog = $database$
     port = 443
  4. Restart Splunk
  5. Now, the Snowflake driver should be available via the UI
  6. Create a new Identity with your Snowflake credentials
  7. Create a new Database Connection for Snowflake with the following:
     - the Snowflake Identity
     - the Snowflake Connection Type
     - the Timezone set/used by your Snowflake Database/Environment
  8. Check the "Edit JDBC URL" checkbox; the URL has to be provided manually as per https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html, i.e. something similar to jdbc:snowflake://<account_name>.snowflakecomputing.com/?<connection_params> OR jdbc:snowflake://<account_name>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse>&schema=<schema>. Replace all the <parameters> to reflect your environment.
  9. Make sure that Read Only is unchecked, as this option is not available on Snowflake
  10. Fetch size can be left alone

Your Database Connection should look as follows: Screenshot

The JDBC URL can be adapted to your environment as long as it matches the Snowflake JDBC Driver Connection String format: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string
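
As a quick way to verify steps 1-3 outside DB Connect, a minimal standalone check along the following lines can confirm that the jar on the classpath is the real driver and that the URL format is accepted. This is only a sketch; every value in angle brackets is a placeholder, not something from this thread.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

// Standalone sanity check for the Snowflake JDBC driver (a sketch, not part of DB Connect).
public class SnowflakeJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if the javadoc jar (or no jar) is on the classpath
        Class.forName("net.snowflake.client.jdbc.SnowflakeDriver");

        String url = "jdbc:snowflake://<account_name>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse>&schema=<schema>";
        Properties props = new Properties();
        props.put("user", "<user_name>");
        props.put("password", "<password>");

        // "No suitable driver found" here means the driver jar is missing from the
        // classpath or the URL does not start with jdbc:snowflake://
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductName());
        }
    }
}

Run it with the downloaded jar on the classpath, e.g. java -cp snowflake-jdbc-<version>.jar:. SnowflakeJdbcCheck; reproducing "No suitable driver found" there is a strong hint that the wrong jar (such as the javadoc jar) is being picked up.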


phoran
New Member

Can you post your connection string? (Leave out the sensitive parts, obviously...)
What is your Snowflake account? Remember that the Snowflake "account" is everything preceding the ".snowflakecomputing.com" in your account URL. There should be both an account name and a region identifier - like "mySnowflakeAccount.us-east-1." for AWS East. Only Snowflake accounts based in the AWS US-West region will not have a region identifier included.
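
For example, with a hypothetical account named mySnowflakeAccount, the two forms would look like:

jdbc:snowflake://mySnowflakeAccount.us-east-1.snowflakecomputing.com/   (AWS US East, region identifier required)
jdbc:snowflake://mySnowflakeAccount.snowflakecomputing.com/             (AWS US West, no region identifier)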

-Paul Horan-
Snowflake

0 Karma

bwu_hcf
New Member

The JDBC URL is below:

jdbc:snowflake://xxx.ap-southeast-2.snowflakecomputing.com:443/?warehouse=XXXXX_WH&db=XXXXXXXXX

0 Karma

dmaislin_splunk
Splunk Employee

I will assume you followed Snowflake’s post on the setup? Have you also reached out to their side for assistance?

https://support.snowflake.net/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect

0 Karma

bwu_hcf
New Member

Thanks for reminding me! That is actually my next step as I am suspecting they could be blocking this type of connection from their end by default.

0 Karma

phoran
New Member

Snowflake will only "block" connection attempts with invalid credentials, or coming from network IP addresses outside your "whitelisted" network policy (should one have been created).
This is more likely an invalid connection string.

-Paul-

0 Karma

bwu_hcf
New Member

Thanks, @phoran ! When I tried to create such a connection via the Splunk DB Connect GUI, it did show "This is an invalid connection" when I was saving the settings. If I ask our Snowflake counterpart to add our IP to their whitelist, this should resolve the issue. Am I correct?

0 Karma

phoran
New Member

If you're trying to connect from outside the whitelisted IP range, the error message from Snowflake is "IP XX.XX.XX.XX is not allowed to access Snowflake. Contact your local security administrator."

The error you're seeing sounds more like a malformed connection string.

-Paul-

0 Karma

dmaislin_splunk
Splunk Employee

I see you have a support case open on this, correct? Did you follow the guidance they suggested regarding the link and the timeout behavior?

0 Karma

bwu_hcf
New Member

Yes, I captured a tcpdump/Wireshark packet trace and will attach the output to the case for further analysis. I have also increased the timeout value to "maxWaitMillis=1800000" to test the connection again, but still no luck; same error messages as the internal logs above. I am upgrading the JDBC driver to the latest version to test again.

0 Karma

dmaislin_splunk
Splunk Employee

What is the version of the JDBC driver? Please provide the link to whichever one you downloaded. Thank you.

0 Karma

bwu_hcf
New Member

I am currently using snowflake-jdbc-3.12.4.jar driver downloaded from the site below:

https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/

0 Karma

dmaislin_splunk
Splunk Employee

Are you using telnet to test from your client PC or from where DBX is installed? Just curious.

0 Karma

bwu_hcf
New Member

Telnet testing is from the Splunk HF where DBX is installed.
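
For reference, the same reachability check can be scripted in Java on the HF where DBX runs. This is only a sketch; the hostname is a placeholder for your Snowflake account URL.

import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

// Sketch of a TCP + TLS reachability check against the Snowflake endpoint.
public class SnowflakeReachabilityCheck {
    public static void main(String[] args) throws Exception {
        String host = "<account_name>.snowflakecomputing.com";
        int port = 443;
        // Opens a TCP connection and completes a TLS handshake, which is a slightly
        // stronger check than plain telnet if a proxy or SSL-intercepting device sits in between.
        try (SSLSocket socket = (SSLSocket) SSLSocketFactory.getDefault().createSocket(host, port)) {
            socket.startHandshake();
            System.out.println("TLS handshake OK with " + host + ":" + port);
        }
    }
}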

0 Karma


thambisetty
SplunkTrust

I am aware there is an accepted answer, and it helped me, but the connection still would not establish; I needed some extra configuration to get it working.

I followed the accepted answer for the JDBC driver (downloaded snowflake-jdbc-3.15.1.jar) and for db_connection_types.conf (don't copy and paste the stanza from the accepted answer; see the db_connection_types.conf below). However, I ran into the error: could not initialize class net.snowflake.client.jdbc.internal.apache.arrow.memory.util.MemoryUtil

I followed the link below to fix that issue:
https://community.snowflake.com/s/article/JDBC-OutOfDirectMemoryError

I added the first two options from that article (quoted below in case the link is broken or removed) to the JVM Options in the connection settings, separated by a space. That didn't work. However, the connection was established with the third option. (See db_connections.conf below to understand how that parameter is used.)

 

1.) Increase the maximum heap size, which in turn will increase the maximum direct memory size. You will need to refer to the application's documentation for instructions on how to configure this value because it is application-specific. If you are starting the application using the java command, then any of the following JVM arguments will set the maximum heap size to 1 GB:

-Xmx1048576k
-Xmx1024m
-Xmx1g

2.) Explicitly increase the maximum direct memory size. E.g., the following JVM argument sets the value to 1 GB:

-XX:MaxDirectMemorySize=1g

3.) If for any reason you do not have any control over the amount of memory you can allocate to your JVM (e.g., you are limited by the size of the container you're running in and it cannot be configured), then change the query result format from ARROW to JSON.

You can pass this setting as a connection parameter using your JDBC driver:

JDBC_QUERY_RESULT_FORMAT=JSON

 


Final db_connections.conf and db_connection_types.conf

 

local/db_connection_types.conf
[snowflake]
displayName = Snowflake
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = net.snowflake.client.jdbc.SnowflakeDriver
jdbcUrlFormat = jdbc:snowflake://<host>:<port>/?db=<database>
ui_default_catalog = $database$
port = 443

local/db_connections.conf
[Snowflake_DB]
connection_properties = {"JDBC_QUERY_RESULT_FORMAT":"JSON"}
connection_type = snowflake
customizedJdbcUrl = jdbc:snowflake://<host>.snowflakecomputing.com:443/?user=<user_name>&db=snowflake&warehouse=<warehouse_value>&schema=public
database = snowflake
disabled = 0
host = <host>.snowflakecomputing.com
identity = SnowflakeUser
jdbcUseSSL = false
localTimezoneConversionEnabled = false
port = 443
readonly = false
timezone = Etc/UTC
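
For context, the connection_properties line above appears to be how DB Connect passes the extra property through to the Snowflake JDBC driver. A rough standalone equivalent (a sketch only, with placeholder host, user, password and warehouse values) would look like this:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

// Sketch of passing JDBC_QUERY_RESULT_FORMAT=JSON directly to the Snowflake JDBC driver,
// mirroring the connection_properties setting in db_connections.conf above.
public class SnowflakeJsonResultFormat {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "<user_name>");
        props.put("password", "<password>");
        // Ask the server to return result sets as JSON instead of Arrow, which avoids
        // the Arrow direct-memory allocation behind the MemoryUtil error described above.
        props.put("JDBC_QUERY_RESULT_FORMAT", "JSON");

        String url = "jdbc:snowflake://<host>.snowflakecomputing.com:443/?db=snowflake&warehouse=<warehouse_value>&schema=public";
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected with JSON result format");
        }
    }
}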

 

 

 

————————————
If this helps, give a like below.
0 Karma



bwu_hcf
New Member

Hi @ptang_splunk ,

Thanks for providing the detailed steps above! I have configured our DB Connect inputs for Snowflake as described in your post, as well as following the doc below from Snowflake:

https://community.snowflake.com/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect

I am able to telnet to port 443 on the host specified in the "JDBC URL" from the Splunk HF where the DB connection is initiated. However, I still encounter the errors below in the internal logs (splunk_app_db_connect_server.log) after restarting the splunkd service:

2020-05-30 00:03:19.220 +1000 [QuartzScheduler_Worker-10] WARN com.zaxxer.hikari.pool.PoolBase - unnamed_pool_1112188830_jdbc_snowflake//XXX.XX-XXXXXXXXX-X.snowflakecomputing.com443/?warehouseSPLUNK_WH&dbSNOWFLAKE - Failed to register management beans.
javax.management.RuntimeOperationsException: null
at com.sun.jmx.mbeanserver.Repository.addMBean(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerWithRepository(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerDynamicMBean(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerObject(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(Unknown Source)
at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(Unknown Source)
at com.zaxxer.hikari.pool.PoolBase.registerMBeans(PoolBase.java:258)
at com.zaxxer.hikari.pool.HikariPool.(HikariPool.java:116)
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:94)
at com.splunk.dbx.connector.ConnectorFactory.getConnectionFromPool(ConnectorFactory.java:201)
at com.splunk.dbx.connector.ConnectorFactory.getConnection(ConnectorFactory.java:189)
at com.splunk.dbx.connector.ConnectorFactory.create(ConnectorFactory.java:174)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.executeQuery(DbInputRecordReader.java:60)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:52)
at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
Caused by: java.lang.IllegalArgumentException: Repository: cannot add mbean for pattern name com.zaxxer.hikari:type=PoolConfig (unnamed_pool_1112188830_jdbc
snowflake//XXX.XX-XXXXXXXXX-X.snowflakecomputing.com443/?warehouseSPLUNK_WH&db_SNOWFLAKE)
... 19 common frames omitted

2020-05-30 00:03:21.109 +1000 [QuartzScheduler_Worker-10] ERROR org.easybatch.core.job.BatchJob - Unable to open record reader
java.sql.SQLFeatureNotSupportedException: null
at net.snowflake.client.jdbc.SnowflakeConnectionV1.setReadOnly(SnowflakeConnectionV1.java:518)
at com.zaxxer.hikari.pool.ProxyConnection.setReadOnly(ProxyConnection.java:388)
at com.zaxxer.hikari.pool.HikariProxyConnection.setReadOnly(HikariProxyConnection.java)
at com.splunk.dbx.connector.ConnectorFactory.getConnection(ConnectorFactory.java:190)
at com.splunk.dbx.connector.ConnectorFactory.create(ConnectorFactory.java:174)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.executeQuery(DbInputRecordReader.java:60)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:52)
at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)

2020-05-30 00:03:21.109 +1000 [QuartzScheduler_Worker-10] INFO org.easybatch.core.job.BatchJob - Job 'SNOWFLAKE_ACCOUNT_QUERY_TEST' finished with status: FAILED

The above error trace does not give much information on why the connection failed, apart from the two messages below:

"Failed to register management beans"
"Unable to open record reader"

0 Karma