
Splunk DB Connect ver. 3.1.1 shows an input error.

yutaka1005
Builder

In my environment, I have configured an input from SQL Server using DB Connect.
After checking the logs, I found that the following error was being output continuously.

[QuartzScheduler_Worker-14] ERROR org.easybatch.core.job.BatchJob - Unable to open record reader
com.microsoft.sqlserver.jdbc.SQLServerException: The query has timed out
at com.microsoft.sqlserver.jdbc.TDSCommand.checkForInterrupt(IOBuffer.java:6498)
at com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:67)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.<init>(SQLServerResultSet.java:310)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1646)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:426)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:372)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:6276)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1794)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:184)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:159)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:284)
at com.splunk.dbx.connector.connector.impl.JdbcConnectorImpl.executeQuery(JdbcConnectorImpl.java:291)
at com.splunk.dbx.connector.connector.impl.JdbcConnectorImpl.executeQuery(JdbcConnectorImpl.java:331)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:80)
at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)

I'm not sure whether this is the complete log, but this error is output every time the input runs.
Is this error caused by the query timeout setting discussed in the answer below?

https://answers.splunk.com/answers/506917/dbconnect-2-input-timed-out-can-i-increase-the-tim.html

Also, will some logs still be captured even if a timeout occurs?

I would greatly appreciate it if anyone could advise.


DavidHourani
Super Champion

Hello Yutaka,

A timed-out query usually means the connection to the service port is failing, often because the port is blocked by a firewall. Please try running the following from your DB Connect host's CLI:

telnet yourdbserver portnumber

If this fails, it means something is blocking the connection, and you should check with your firewall team to find out where the traffic is being denied.
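
For example, if SQL Server is listening on its default port 1433 (substitute your actual host and port), the check would look like:

telnet yourdbserver 1433

If the port is open, telnet connects and shows a "Connected to ..." message; if it is blocked, the command hangs or reports "Connection refused".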

Regards,
David


yutaka1005
Builder

Thank you for your answer!

Since data is actually being captured, it is hard to believe that the connection is blocked.

If the connection is not blocked, i.e. the port is open, is the following setting in "db_inputs.conf" the relevant one after all?

query_timeout =
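
For illustration, a minimal sketch of how this might look in "db_inputs.conf" (the stanza name, connection name, and the 600-second value are just placeholders for this example; query_timeout is specified in seconds):

[my_sqlserver_input]
connection = my_sqlserver_connection
query_timeout = 600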


DavidHourani
Super Champion

Can you please post an extract of your db_inputs config file?

If you're using an incremental db input config, you are not going to lose data: when the input next runs successfully, it will take everything after the checkpoint and store the new checkpoint value for the next import.

Changing the query_timeout can help. To double-check whether it will change anything, test a very large dbxquery and see if it works; if it does, then the timeout is not your problem. It could be that your DB has a restriction on the maximum number of queries it can receive per day from your account, or a limit on the maximum batch size that can be exported 🙂
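
For example, something along these lines in the Splunk search bar (the connection name and query here are placeholders; timeout is in seconds):

| dbxquery connection="my_sqlserver_connection" query="SELECT * FROM my_large_table" timeout=600

If a deliberately large query like this completes without the "query has timed out" error, the timeout setting is probably not the cause.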


yutaka1005
Builder

Thank you for your answer.

Excuse me, but it is difficult for me to share our configuration values.

Yes, I'm using a "rising column" input, so I never lose data.
However, if a timeout occurs, my alert may miss data that should have been detected.

So I think I can avoid this by changing the value of query_timeout.
I also checked the timeout value on the DB side, but it defaults to 10 minutes, so the problem must be the timeout value on the Splunk DB Connect side.
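
For reference, the 10-minute default I found may correspond to SQL Server's "remote query timeout" option (whose default is 600 seconds, i.e. 10 minutes); assuming that is the setting in question, it can be checked on the DB side with:

EXEC sp_configure 'remote query timeout';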
